Apr 28 19:13:57.701025 ip-10-0-139-184 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 28 19:13:57.701036 ip-10-0-139-184 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 28 19:13:57.701043 ip-10-0-139-184 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 28 19:13:57.701331 ip-10-0-139-184 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 28 19:14:07.829352 ip-10-0-139-184 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 28 19:14:07.829369 ip-10-0-139-184 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 435153f60bd4431c9072906e4041fec7 --
Apr 28 19:16:23.931936 ip-10-0-139-184 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:16:24.315542 ip-10-0-139-184 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:24.315542 ip-10-0-139-184 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:16:24.315542 ip-10-0-139-184 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:24.315542 ip-10-0-139-184 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:16:24.315542 ip-10-0-139-184 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:24.317597 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.317510    2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:16:24.320445 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320431    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:24.320445 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320446    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320451    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320455    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320458    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320461    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320464    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320467    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320470    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320472    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320475    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320477    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320480    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320483    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320485    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320488    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320490    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320494    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320498    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320507    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:24.320507 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320510    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320513    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320516    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320519    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320521    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320524    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320526    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320529    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320531    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320534    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320536    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320538    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320541    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320543    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320546    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320549    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320551    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320554    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320556    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320559    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:24.320971 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320562    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320564    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320567    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320571    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320573    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320576    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320578    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320580    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320583    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320585    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320588    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320590    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320593    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320595    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320612    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320615    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320618    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320620    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320623    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320626    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:24.321487 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320629    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320631    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320634    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320636    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320639    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320641    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320644    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320647    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320649    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320652    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320654    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320657    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320659    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320662    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320664    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320667    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320670    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320672    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320676    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320679    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:24.321994 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320681    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:24.322478 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320684    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:24.322478 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320687    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:24.322478 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320689    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:24.322478 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320693    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:24.322478 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.320697    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:24.322733 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322721    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:24.322733 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322733    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322736    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322739    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322743    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322746    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322749    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322751    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322754    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322757    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322759    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322762    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322765    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322768    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322771    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322773    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322776    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322779    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322781    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322784    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:24.322791 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322787    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322789    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322792    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322795    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322797    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322800    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322803    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322806    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322808    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322811    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322814    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322817    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322819    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322822    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322824    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322826    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322829    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322831    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322834    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322836    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:24.323235 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322838    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322841    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322843    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322846    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322848    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322851    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322853    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322856    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322858    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322861    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322863    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322867    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322871    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322874    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322879    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322882    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322884    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322886    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322889    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:24.323740 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322891    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322894    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322897    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322899    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322901    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322904    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322906    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322909    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322911    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322914    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322916    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322919    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322921    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322923    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322926    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322928    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322931    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322935    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322938    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322941    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:24.324196 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322945    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322948    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322950    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322953    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322955    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322958    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.322961    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323031    2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323039    2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323045    2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323051    2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323056    2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323059    2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323064    2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323068    2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323072    2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323075    2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323079    2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323082    2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323086    2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323089    2576 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323091    2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323094    2576 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:16:24.324695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323097    2576 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323100    2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323103    2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323108    2576 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323110    2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323113    2576 flags.go:64] FLAG: --config-dir=""
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323116    2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323119    2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323123    2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323127    2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323130    2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323133    2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323136    2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323139    2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323141    2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323144    2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323148    2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323152    2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323155    2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323158    2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323161    2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323164    2576 flags.go:64] FLAG: --enable-server="true"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323167    2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323171    2576 flags.go:64] FLAG: --event-burst="100"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323174    2576 flags.go:64] FLAG: --event-qps="50"
Apr 28 19:16:24.325245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323177    2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323181    2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323183    2576 flags.go:64] FLAG: --eviction-hard=""
Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323187    2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323190    2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323193    2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 28
19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323196 2576 flags.go:64] FLAG: --eviction-soft="" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323199 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323202 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323205 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323208 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323212 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323215 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323218 2576 flags.go:64] FLAG: --feature-gates="" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323222 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323225 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323228 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323231 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323234 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323238 2576 flags.go:64] FLAG: --help="false" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323242 2576 flags.go:64] FLAG: 
--hostname-override="ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323245 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323249 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 28 19:16:24.325870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323252 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323256 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323259 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323262 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323265 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323268 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323270 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323273 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323276 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323279 2576 flags.go:64] FLAG: --kube-reserved="" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323282 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 28 19:16:24.326475 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:16:24.323285 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323288 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323291 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323294 2576 flags.go:64] FLAG: --lock-file="" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323296 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323299 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323302 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323307 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323310 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323313 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323315 2576 flags.go:64] FLAG: --logging-format="text" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323318 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323321 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 28 19:16:24.326475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323324 2576 flags.go:64] FLAG: --manifest-url="" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323327 2576 flags.go:64] FLAG: 
--manifest-url-header="" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323331 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323334 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323339 2576 flags.go:64] FLAG: --max-pods="110" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323344 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323347 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323350 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323354 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323357 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323360 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323362 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323371 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323374 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323377 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323380 2576 flags.go:64] FLAG: --pod-cidr="" Apr 28 19:16:24.327091 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323383 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323389 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323392 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323395 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323398 2576 flags.go:64] FLAG: --port="10250" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323401 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323404 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08983ed9a486b5746" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323407 2576 flags.go:64] FLAG: --qos-reserved="" Apr 28 19:16:24.327091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323410 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323413 2576 flags.go:64] FLAG: --register-node="true" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323416 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323419 2576 flags.go:64] FLAG: --register-with-taints="" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323427 2576 flags.go:64] FLAG: --registry-burst="10" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323430 2576 flags.go:64] FLAG: --registry-qps="5" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: 
I0428 19:16:24.323432 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323435 2576 flags.go:64] FLAG: --reserved-memory="" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323439 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323442 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323445 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323448 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323452 2576 flags.go:64] FLAG: --runonce="false" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323459 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323463 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323466 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323469 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323472 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323475 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323478 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323481 2576 flags.go:64] FLAG: --storage-driver-password="root" 
Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323484 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323487 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323490 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323492 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323495 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 28 19:16:24.327686 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323498 2576 flags.go:64] FLAG: --system-cgroups="" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323501 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323506 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323509 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323511 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323516 2576 flags.go:64] FLAG: --tls-min-version="" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323519 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323521 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323524 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323528 
2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323531 2576 flags.go:64] FLAG: --v="2" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323536 2576 flags.go:64] FLAG: --version="false" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323540 2576 flags.go:64] FLAG: --vmodule="" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323544 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.323548 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323666 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323671 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323674 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323677 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323683 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323686 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323690 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 28 19:16:24.328294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323694 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323697 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323700 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323702 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323705 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323708 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323711 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323713 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323716 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323718 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323721 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323723 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:16:24.328816 ip-10-0-139-184 
kubenswrapper[2576]: W0428 19:16:24.323725 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323728 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323730 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323736 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323739 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323742 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323744 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:16:24.328816 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323747 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323750 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323753 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323755 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323758 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323760 2576 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323763 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323766 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323769 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323771 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323774 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323777 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323780 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323783 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323785 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323788 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323791 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323793 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323796 2576 
feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323798 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323801 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:16:24.329294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323803 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323806 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323808 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323811 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323813 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323815 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323818 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323821 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323824 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323827 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:16:24.329909 
ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323829 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323832 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323834 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323837 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323839 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323841 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323844 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323846 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323849 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323851 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:16:24.329909 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323854 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323856 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323859 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC 
Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323862 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323865 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323867 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323870 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323872 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323875 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323877 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323880 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323882 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323885 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323888 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323892 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323895 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323898 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323901 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:16:24.330403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.323903 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:16:24.330872 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.324576 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 28 19:16:24.331234 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.331213 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 28 19:16:24.331264 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.331236 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 28 19:16:24.331294 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331288 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331295 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331299 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement 
Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331303 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331306 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331310 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331312 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331315 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331318 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331320 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331323 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331326 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:24.331324 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331329 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331332 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331335 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331337 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331340 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331342 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331345 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331347 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331350 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331352 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331354 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331357 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331359 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331362 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331365 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331367 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331370 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331372 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331374 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331378 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:24.331640 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331380 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331383 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331385 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331388 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331390 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331393 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331395 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331398 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331401 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331405 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331408 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331410 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331413 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331415 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331418 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331421 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331424 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331427 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331430 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331432 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:24.332120 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331435 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331438 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331440 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331443 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331445 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331448 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331450 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331453 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331455 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331458 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331460 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331463 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331467 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331472 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331474 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331477 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331480 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331483 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331486 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:24.332637 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331488 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331491 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331493 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331496 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331499 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331501 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331504 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331507 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331510 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331512 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331515 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331518 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331521 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331523 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331525 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:24.333103 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.331531 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331644 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331650 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331653 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331656 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331659 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331662 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331665 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331667 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331670 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331672 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331675 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331679 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331683 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331686 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331688 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331691 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331694 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331697 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331700 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:24.333479 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331702 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331705 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331707 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331710 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331713 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331716 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331718 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331721 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331723 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331726 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331728 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331730 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331733 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331735 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331738 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331740 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331743 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331745 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331748 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331750 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:24.334066 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331753 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331755 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331757 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331760 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331763 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331765 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331768 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331770 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331773 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331775 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331777 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331780 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331782 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331785 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331787 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331790 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331792 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331796 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331798 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331801 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:24.334598 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331804 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331806 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331809 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331811 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331814 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331816 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331818 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331821 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331823 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331826 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331828 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331831 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331834 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331837 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331839 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331842 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331845 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331848 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331850 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:24.335090 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331852 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331855 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331857 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331860 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331862 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331865 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331867 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:24.331869 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.331875 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.332477 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.334328 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.335126 2576 server.go:1019] "Starting client certificate rotation"
Apr 28 19:16:24.335563 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.335219 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:24.335900 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.335655 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:24.353827 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.353799 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:24.357731 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.357705 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:24.368261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.368243 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 28 19:16:24.373087 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.373073 2576 log.go:25] "Validated CRI v1 image API"
Apr 28 19:16:24.374220 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.374204 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 28 19:16:24.377904 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.377763 2576 fs.go:135] Filesystem UUIDs: map[1e1be56b-ea99-47d0-af8c-861bb5158d12:/dev/nvme0n1p3 3d84880a-9407-4c23-876c-3562fd496cd3:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 28 19:16:24.377904 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.377904 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 28 19:16:24.383463 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.383349 2576 manager.go:217] Machine: {Timestamp:2026-04-28 19:16:24.381621241 +0000 UTC m=+0.344521618 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3069195 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2004bdc1990a743cde91e11d9afd3b SystemUUID:ec2004bd-c199-0a74-3cde-91e11d9afd3b BootID:435153f6-0bd4-431c-9072-906e4041fec7 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:90:b2:d3:4b:91 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:90:b2:d3:4b:91 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:e5:b3:92:c7:52 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 28 19:16:24.384102 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.384089 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 28 19:16:24.384190 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.384179 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 28 19:16:24.385144 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.385119 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 28 19:16:24.385299 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.385147 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-184.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Qu
antity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 28 19:16:24.385348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.385309 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 28 19:16:24.385348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.385317 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 28 19:16:24.385348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.385334 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:16:24.386025 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.386015 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:16:24.386377 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.386361 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:16:24.387090 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.387080 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:16:24.387208 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.387199 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 28 19:16:24.389595 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.389584 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 28 19:16:24.389672 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.389620 2576 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 28 19:16:24.389672 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.389642 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 28 19:16:24.389672 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.389655 2576 kubelet.go:397] "Adding apiserver pod source" Apr 28 19:16:24.389672 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.389668 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 28 19:16:24.390564 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.390550 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:16:24.390651 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.390571 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:16:24.393117 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.393102 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 28 19:16:24.394338 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.394326 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 28 19:16:24.395981 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.395968 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 28 19:16:24.396080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.395988 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 28 19:16:24.396080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.395994 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 28 19:16:24.396080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.396000 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 28 19:16:24.396080 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.396006 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 28 19:16:24.396080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.396011 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 28 19:16:24.396080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.396017 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 28 19:16:24.396080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.396022 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 28 19:16:24.396080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.396029 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 28 19:16:24.396080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.396036 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 28 19:16:24.396080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.396045 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 28 19:16:24.396080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.396053 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 28 19:16:24.397809 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.397774 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 28 19:16:24.397892 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.397816 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 28 19:16:24.402138 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.402120 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 28 19:16:24.402204 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.402168 2576 server.go:1295] "Started kubelet" Apr 28 19:16:24.402310 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.402265 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 
Apr 28 19:16:24.402391 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.402304 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 28 19:16:24.402391 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.402355 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 28 19:16:24.402744 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.402708 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 28 19:16:24.402815 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.402756 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-184.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 28 19:16:24.402923 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.402904 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-184.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 28 19:16:24.403453 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.403437 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 28 19:16:24.403473 ip-10-0-139-184 systemd[1]: Started Kubernetes Kubelet. 
Apr 28 19:16:24.404889 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.404875 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 28 19:16:24.407888 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.407872 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 28 19:16:24.408093 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.407235 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-184.ec2.internal.18aa9b5327b96c0c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-184.ec2.internal,UID:ip-10-0-139-184.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-184.ec2.internal,},FirstTimestamp:2026-04-28 19:16:24.402136076 +0000 UTC m=+0.365036433,LastTimestamp:2026-04-28 19:16:24.402136076 +0000 UTC m=+0.365036433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-184.ec2.internal,}" Apr 28 19:16:24.408289 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.408271 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 28 19:16:24.409066 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.409048 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 28 19:16:24.409281 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.409260 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 28 19:16:24.409359 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.409188 2576 factory.go:55] Registering systemd factory Apr 28 19:16:24.409359 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.409313 2576 
factory.go:223] Registration of the systemd container factory successfully Apr 28 19:16:24.409433 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.409361 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 28 19:16:24.409433 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.409405 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 28 19:16:24.409433 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.409413 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 28 19:16:24.410421 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.410264 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:24.410563 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.410548 2576 factory.go:153] Registering CRI-O factory Apr 28 19:16:24.410653 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.410565 2576 factory.go:223] Registration of the crio container factory successfully Apr 28 19:16:24.410653 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.410624 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 28 19:16:24.410653 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.410648 2576 factory.go:103] Registering Raw factory Apr 28 19:16:24.410797 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.410660 2576 manager.go:1196] Started watching for new ooms in manager Apr 28 19:16:24.412320 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.412303 2576 manager.go:319] Starting recovery of all containers Apr 28 19:16:24.415464 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.415426 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 28 19:16:24.415967 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.415926 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-184.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 28 19:16:24.416088 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.416009 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 28 19:16:24.420647 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.420624 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zm7hb" Apr 28 19:16:24.425340 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.425321 2576 manager.go:324] Recovery completed Apr 28 19:16:24.429745 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.429722 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zm7hb" Apr 28 19:16:24.430083 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.430072 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:24.433114 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.433099 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:24.433185 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.433126 2576 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:24.433185 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.433136 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:24.433718 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.433704 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 28 19:16:24.433777 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.433719 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 28 19:16:24.433777 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.433734 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:16:24.435899 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.435883 2576 policy_none.go:49] "None policy: Start" Apr 28 19:16:24.435899 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.435901 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 28 19:16:24.436012 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.435911 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 28 19:16:24.436012 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.435866 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-184.ec2.internal.18aa9b5329921a08 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-184.ec2.internal,UID:ip-10-0-139-184.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-139-184.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-139-184.ec2.internal,},FirstTimestamp:2026-04-28 19:16:24.433113608 +0000 UTC 
m=+0.396013965,LastTimestamp:2026-04-28 19:16:24.433113608 +0000 UTC m=+0.396013965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-184.ec2.internal,}" Apr 28 19:16:24.474255 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.474237 2576 manager.go:341] "Starting Device Plugin manager" Apr 28 19:16:24.492732 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.474315 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 28 19:16:24.492732 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.474332 2576 server.go:85] "Starting device plugin registration server" Apr 28 19:16:24.492732 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.474584 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 28 19:16:24.492732 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.474594 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 28 19:16:24.492732 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.474728 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 28 19:16:24.492732 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.474828 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 28 19:16:24.492732 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.474836 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 28 19:16:24.492732 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.475696 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 28 19:16:24.492732 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.475754 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:24.507328 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.507288 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 28 19:16:24.508795 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.508772 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 28 19:16:24.508890 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.508800 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 28 19:16:24.508890 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.508818 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 28 19:16:24.508890 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.508826 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 28 19:16:24.509000 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.508914 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 28 19:16:24.513260 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.513242 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:24.575808 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.575725 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:24.578081 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.578060 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:24.578203 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.578095 2576 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:24.578203 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.578106 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:24.578203 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.578130 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.589335 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.589310 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.589456 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.589339 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-184.ec2.internal\": node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:24.609130 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.609092 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal"] Apr 28 19:16:24.609258 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.609182 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:24.609258 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.609232 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:24.610060 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.610047 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:24.610128 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.610073 2576 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:24.610128 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.610083 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:24.612438 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.612426 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:24.612597 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.612581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.612680 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.612639 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:24.613324 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.613307 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:24.613395 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.613338 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:24.613395 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.613348 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:24.613395 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.613315 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:24.613485 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.613408 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 28 19:16:24.613485 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.613422 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:24.615763 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.615747 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.615847 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.615773 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:24.616584 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.616566 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:24.616708 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.616597 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:24.616708 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.616628 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:24.629134 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.629114 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-184.ec2.internal\" not found" node="ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.633044 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.633029 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-184.ec2.internal\" not found" node="ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.710307 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.710270 2576 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:24.710454 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.710338 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.710454 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.710363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.810832 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.810797 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:24.810991 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.810840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.810991 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.810864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.810991 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.810893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e3440d423eacb4ec58cf1cc320321a41-config\") pod \"kube-apiserver-proxy-ip-10-0-139-184.ec2.internal\" (UID: \"e3440d423eacb4ec58cf1cc320321a41\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.810991 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.810930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.810991 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.810956 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77560a24cc76a08538baa0efe4273073-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal\" (UID: \"77560a24cc76a08538baa0efe4273073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.911796 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:24.911698 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:24.911796 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.911715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/e3440d423eacb4ec58cf1cc320321a41-config\") pod \"kube-apiserver-proxy-ip-10-0-139-184.ec2.internal\" (UID: \"e3440d423eacb4ec58cf1cc320321a41\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.911796 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.911748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e3440d423eacb4ec58cf1cc320321a41-config\") pod \"kube-apiserver-proxy-ip-10-0-139-184.ec2.internal\" (UID: \"e3440d423eacb4ec58cf1cc320321a41\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.930911 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.930884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 28 19:16:24.935428 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:24.935409 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 28 19:16:25.012311 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:25.012259 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:25.112708 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:25.112674 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:25.213110 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:25.213022 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:25.313480 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:25.313443 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:25.335710 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.335673 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 28 19:16:25.336290 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.335903 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 28 19:16:25.408319 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.408291 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 28 19:16:25.414655 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:25.414623 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:25.426220 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.426196 2576 
reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:16:25.431501 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.431471 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:11:24 +0000 UTC" deadline="2027-09-26 12:21:10.899947666 +0000 UTC" Apr 28 19:16:25.431501 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.431500 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12377h4m45.468451382s" Apr 28 19:16:25.453294 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.453260 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gj2z5" Apr 28 19:16:25.462327 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.462302 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gj2z5" Apr 28 19:16:25.480781 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:25.480730 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3440d423eacb4ec58cf1cc320321a41.slice/crio-e84ac6e21aabc4ee77eaaec023e81e20c77920fc06cb0a3978f3121f9933812a WatchSource:0}: Error finding container e84ac6e21aabc4ee77eaaec023e81e20c77920fc06cb0a3978f3121f9933812a: Status 404 returned error can't find the container with id e84ac6e21aabc4ee77eaaec023e81e20c77920fc06cb0a3978f3121f9933812a Apr 28 19:16:25.485825 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.485806 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:16:25.491393 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:25.491363 2576 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77560a24cc76a08538baa0efe4273073.slice/crio-1301fa4a10d0afec787451c132e045ad390ca95d5457ec56eb98e218eac4c1fc WatchSource:0}: Error finding container 1301fa4a10d0afec787451c132e045ad390ca95d5457ec56eb98e218eac4c1fc: Status 404 returned error can't find the container with id 1301fa4a10d0afec787451c132e045ad390ca95d5457ec56eb98e218eac4c1fc Apr 28 19:16:25.511371 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.511315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" event={"ID":"e3440d423eacb4ec58cf1cc320321a41","Type":"ContainerStarted","Data":"e84ac6e21aabc4ee77eaaec023e81e20c77920fc06cb0a3978f3121f9933812a"} Apr 28 19:16:25.512141 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.512121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" event={"ID":"77560a24cc76a08538baa0efe4273073","Type":"ContainerStarted","Data":"1301fa4a10d0afec787451c132e045ad390ca95d5457ec56eb98e218eac4c1fc"} Apr 28 19:16:25.515347 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:25.515330 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-184.ec2.internal\" not found" Apr 28 19:16:25.553577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.553551 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:25.556789 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.556772 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:25.584655 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.584621 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:25.610407 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:16:25.610373 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" Apr 28 19:16:25.623274 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.623255 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:16:25.624012 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.624000 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" Apr 28 19:16:25.633147 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:25.633126 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:16:26.390036 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.390006 2576 apiserver.go:52] "Watching apiserver" Apr 28 19:16:26.398396 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.398372 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 28 19:16:26.398829 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.398796 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-node-tuning-operator/tuned-z4tgm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal","openshift-multus/multus-rktwr","openshift-multus/network-metrics-daemon-8j8w9","openshift-network-diagnostics/network-check-target-hgdfg","openshift-network-operator/iptables-alerter-z9bgv","openshift-ovn-kubernetes/ovnkube-node-5tdk8","kube-system/konnectivity-agent-j966f","kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj","openshift-image-registry/node-ca-xfk6k","openshift-multus/multus-additional-cni-plugins-wt99f"] Apr 28 19:16:26.401302 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.401282 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.404028 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.404005 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:26.404028 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.404021 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:26.404188 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.404043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jc7mx\"" Apr 28 19:16:26.405653 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.405536 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.407556 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.407539 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:26.407677 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:26.407624 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:26.408154 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.407838 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 28 19:16:26.408154 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.407915 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 28 19:16:26.408154 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.408051 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 28 19:16:26.408154 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.408067 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pjj6k\"" Apr 28 19:16:26.408154 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.408073 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 28 19:16:26.409736 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.409716 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:26.409830 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:26.409781 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:26.409830 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.409794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z9bgv" Apr 28 19:16:26.412556 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.412063 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.412556 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.412279 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 28 19:16:26.412556 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.412288 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:16:26.412556 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.412354 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5nmfb\"" Apr 28 19:16:26.412795 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.412579 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 28 19:16:26.414822 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.414801 2576 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 28 19:16:26.415348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.414973 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 28 19:16:26.415348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.415028 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 28 19:16:26.415348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.415096 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 28 19:16:26.415348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.415207 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 28 19:16:26.415972 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.415956 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4xdc4\"" Apr 28 19:16:26.417404 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.417385 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j966f" Apr 28 19:16:26.418382 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.418363 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 28 19:16:26.420795 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.420773 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 28 19:16:26.421095 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.421078 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xfk6k" Apr 28 19:16:26.421888 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.421866 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" Apr 28 19:16:26.423383 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-sysconfig\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.423472 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-sysctl-d\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.423472 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-var-lib-cni-multus\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.423572 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-systemd\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.423640 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-host\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.423726 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-conf-dir\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.423786 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-daemon-config\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.423786 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-etc-kubernetes\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.423882 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bxd2\" (UniqueName: \"kubernetes.io/projected/35de0ddf-e6a6-49cd-b5bd-9d110f16b469-kube-api-access-8bxd2\") pod \"iptables-alerter-z9bgv\" (UID: \"35de0ddf-e6a6-49cd-b5bd-9d110f16b469\") " 
pod="openshift-network-operator/iptables-alerter-z9bgv" Apr 28 19:16:26.423882 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-modprobe-d\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.424008 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.423990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-run\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.424060 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-lib-modules\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.424060 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424053 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 28 19:16:26.424147 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-var-lib-kubelet\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.424147 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424104 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pwrb\" (UniqueName: \"kubernetes.io/projected/84fd7385-382e-46ca-a795-1826538a8901-kube-api-access-5pwrb\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.424241 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424216 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wgwbw\"" Apr 28 19:16:26.424309 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-cni-dir\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.424356 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vhhk\" (UniqueName: \"kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk\") pod \"network-check-target-hgdfg\" (UID: \"ca288914-564f-4959-9c10-76a6327678fe\") " pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:26.424647 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-sysctl-conf\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.424739 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424662 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-sys\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.424739 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35de0ddf-e6a6-49cd-b5bd-9d110f16b469-host-slash\") pod \"iptables-alerter-z9bgv\" (UID: \"35de0ddf-e6a6-49cd-b5bd-9d110f16b469\") " pod="openshift-network-operator/iptables-alerter-z9bgv" Apr 28 19:16:26.424883 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-socket-dir-parent\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.424955 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv7dm\" (UniqueName: \"kubernetes.io/projected/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-kube-api-access-gv7dm\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:26.425007 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424956 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-kubernetes\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " 
pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.425007 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.424985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-cni-binary-copy\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.425091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 28 19:16:26.425216 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-run-netns\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.425267 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:26.425348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-os-release\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.425396 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425370 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-run-k8s-cni-cncf-io\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.425443 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-var-lib-cni-bin\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.425443 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-hostroot\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.425531 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/35de0ddf-e6a6-49cd-b5bd-9d110f16b469-iptables-alerter-script\") pod \"iptables-alerter-z9bgv\" (UID: \"35de0ddf-e6a6-49cd-b5bd-9d110f16b469\") " pod="openshift-network-operator/iptables-alerter-z9bgv" Apr 28 19:16:26.425531 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/84fd7385-382e-46ca-a795-1826538a8901-etc-tuned\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " 
pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.425531 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84fd7385-382e-46ca-a795-1826538a8901-tmp\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.425673 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-system-cni-dir\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.425673 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-cnibin\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.425673 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425618 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-var-lib-kubelet\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.425673 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-run-multus-certs\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.425821 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.425709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khmmj\" (UniqueName: \"kubernetes.io/projected/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-kube-api-access-khmmj\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.426866 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.426369 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.426866 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.426464 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-m9bsx\""
Apr 28 19:16:26.426866 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.426654 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-plvh4\""
Apr 28 19:16:26.426866 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.426742 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 28 19:16:26.426866 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.426758 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 28 19:16:26.427110 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.426994 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 28 19:16:26.427110 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.426996 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 28 19:16:26.427110 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.427070 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 28 19:16:26.429268 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.428848 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 28 19:16:26.429268 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.429074 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 28 19:16:26.429268 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.429111 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lqs9h\""
Apr 28 19:16:26.463597 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.463568 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:25 +0000 UTC" deadline="2028-01-10 11:09:18.890209322 +0000 UTC"
Apr 28 19:16:26.463597 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.463596 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14919h52m52.426617094s"
Apr 28 19:16:26.511351 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.511322 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 28 19:16:26.526616 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526561 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-log-socket\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.526616 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr28s\" (UniqueName: \"kubernetes.io/projected/51e675bf-bae4-491c-adfc-eae81fef84bf-kube-api-access-nr28s\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.526862 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-run\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.526862 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-var-lib-kubelet\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.526862 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhhk\" (UniqueName: \"kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk\") pod \"network-check-target-hgdfg\" (UID: \"ca288914-564f-4959-9c10-76a6327678fe\") " pod="openshift-network-diagnostics/network-check-target-hgdfg"
Apr 28 19:16:26.526862 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526757 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-registration-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.526862 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-run\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.526862 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-var-lib-kubelet\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.526862 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-sysctl-conf\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.526862 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-kubelet\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.526862 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-run-ovn-kubernetes\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff2f50e1-de53-4f11-a477-9236b340536b-ovnkube-script-lib\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pnfw\" (UniqueName: \"kubernetes.io/projected/b58c1c28-103c-4a87-a33c-367210699fab-kube-api-access-4pnfw\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.526998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-sysctl-conf\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-socket-dir-parent\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv7dm\" (UniqueName: \"kubernetes.io/projected/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-kube-api-access-gv7dm\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-socket-dir-parent\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-cni-netd\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff2f50e1-de53-4f11-a477-9236b340536b-ovnkube-config\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5119881-7aaa-4ea1-8738-f8463adc7b0c-serviceca\") pod \"node-ca-xfk6k\" (UID: \"b5119881-7aaa-4ea1-8738-f8463adc7b0c\") " pod="openshift-image-registry/node-ca-xfk6k"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527154 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-kubernetes\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-cni-binary-copy\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.527261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-kubernetes\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-run-netns\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-run-ovn\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-run-k8s-cni-cncf-io\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-var-lib-cni-bin\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-run-k8s-cni-cncf-io\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-var-lib-cni-bin\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527376 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-run-netns\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/35de0ddf-e6a6-49cd-b5bd-9d110f16b469-iptables-alerter-script\") pod \"iptables-alerter-z9bgv\" (UID: \"35de0ddf-e6a6-49cd-b5bd-9d110f16b469\") " pod="openshift-network-operator/iptables-alerter-z9bgv"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-systemd-units\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-cni-bin\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-sys-fs\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51e675bf-bae4-491c-adfc-eae81fef84bf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.527800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/84fd7385-382e-46ca-a795-1826538a8901-etc-tuned\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.527918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-cni-binary-copy\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-cnibin\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/35de0ddf-e6a6-49cd-b5bd-9d110f16b469-iptables-alerter-script\") pod \"iptables-alerter-z9bgv\" (UID: \"35de0ddf-e6a6-49cd-b5bd-9d110f16b469\") " pod="openshift-network-operator/iptables-alerter-z9bgv"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528062 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-cnibin\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khmmj\" (UniqueName: \"kubernetes.io/projected/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-kube-api-access-khmmj\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-run-netns\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fp5b\" (UniqueName: \"kubernetes.io/projected/b5119881-7aaa-4ea1-8738-f8463adc7b0c-kube-api-access-4fp5b\") pod \"node-ca-xfk6k\" (UID: \"b5119881-7aaa-4ea1-8738-f8463adc7b0c\") " pod="openshift-image-registry/node-ca-xfk6k"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff2f50e1-de53-4f11-a477-9236b340536b-ovn-node-metrics-cert\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-host\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-daemon-config\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bxd2\" (UniqueName: \"kubernetes.io/projected/35de0ddf-e6a6-49cd-b5bd-9d110f16b469-kube-api-access-8bxd2\") pod \"iptables-alerter-z9bgv\" (UID: \"35de0ddf-e6a6-49cd-b5bd-9d110f16b469\") " pod="openshift-network-operator/iptables-alerter-z9bgv"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-run-openvswitch\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-modprobe-d\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.528398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-lib-modules\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-host\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pwrb\" (UniqueName: \"kubernetes.io/projected/84fd7385-382e-46ca-a795-1826538a8901-kube-api-access-5pwrb\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-cni-dir\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c1267be0-6760-422e-b253-3d9b5132c496-agent-certs\") pod \"konnectivity-agent-j966f\" (UID: \"c1267be0-6760-422e-b253-3d9b5132c496\") " pod="kube-system/konnectivity-agent-j966f"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-etc-selinux\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5119881-7aaa-4ea1-8738-f8463adc7b0c-host\") pod \"node-ca-xfk6k\" (UID: \"b5119881-7aaa-4ea1-8738-f8463adc7b0c\") " pod="openshift-image-registry/node-ca-xfk6k"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-cnibin\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-sys\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35de0ddf-e6a6-49cd-b5bd-9d110f16b469-host-slash\") pod \"iptables-alerter-z9bgv\" (UID: \"35de0ddf-e6a6-49cd-b5bd-9d110f16b469\") " pod="openshift-network-operator/iptables-alerter-z9bgv"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdd8\" (UniqueName: \"kubernetes.io/projected/ff2f50e1-de53-4f11-a477-9236b340536b-kube-api-access-2pdd8\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-slash\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-var-lib-openvswitch\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528757 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-system-cni-dir\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-os-release\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-os-release\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.528895 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-hostroot\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-os-release\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-hostroot\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-daemon-config\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528974 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-cni-dir\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.528926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff2f50e1-de53-4f11-a477-9236b340536b-env-overrides\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529031 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-sys\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-modprobe-d\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:26.529092 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51e675bf-bae4-491c-adfc-eae81fef84bf-cni-binary-copy\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35de0ddf-e6a6-49cd-b5bd-9d110f16b469-host-slash\") pod \"iptables-alerter-z9bgv\" (UID: \"35de0ddf-e6a6-49cd-b5bd-9d110f16b469\") " pod="openshift-network-operator/iptables-alerter-z9bgv"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-lib-modules\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84fd7385-382e-46ca-a795-1826538a8901-tmp\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:26.529206 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs podName:4236a3f6-5c96-4e29-bb77-8dafe3cd242d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:27.029181389 +0000 UTC m=+2.992081733 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs") pod "network-metrics-daemon-8j8w9" (UID: "4236a3f6-5c96-4e29-bb77-8dafe3cd242d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-system-cni-dir\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529315 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-var-lib-kubelet\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.529630 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-run-multus-certs\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-device-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529388 2576
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-var-lib-kubelet\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-system-cni-dir\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-sysconfig\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-run-multus-certs\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-sysctl-d\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529458 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-var-lib-cni-multus\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-run-systemd\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-sysconfig\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-host-var-lib-cni-multus\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-etc-openvswitch\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529573 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-sysctl-d\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529594 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c1267be0-6760-422e-b253-3d9b5132c496-konnectivity-ca\") pod \"konnectivity-agent-j966f\" (UID: \"c1267be0-6760-422e-b253-3d9b5132c496\") " pod="kube-system/konnectivity-agent-j966f" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-socket-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/51e675bf-bae4-491c-adfc-eae81fef84bf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-systemd\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " 
pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.530577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529725 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-conf-dir\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.531289 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-etc-kubernetes\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.531289 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529755 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/84fd7385-382e-46ca-a795-1826538a8901-etc-systemd\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.531289 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-node-log\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.531289 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-etc-kubernetes\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" 
Apr 28 19:16:26.531289 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.529819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-multus-conf-dir\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.531289 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.531060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/84fd7385-382e-46ca-a795-1826538a8901-etc-tuned\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.532109 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.532086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84fd7385-382e-46ca-a795-1826538a8901-tmp\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.533463 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:26.533444 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:26.533572 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:26.533467 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:26.533572 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:26.533487 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5vhhk for pod openshift-network-diagnostics/network-check-target-hgdfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:26.533572 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:26.533546 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk podName:ca288914-564f-4959-9c10-76a6327678fe nodeName:}" failed. No retries permitted until 2026-04-28 19:16:27.033527742 +0000 UTC m=+2.996428088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5vhhk" (UniqueName: "kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk") pod "network-check-target-hgdfg" (UID: "ca288914-564f-4959-9c10-76a6327678fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:26.536846 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.536823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khmmj\" (UniqueName: \"kubernetes.io/projected/d4d04cef-5f05-4c8f-82a1-c1e8350c738c-kube-api-access-khmmj\") pod \"multus-rktwr\" (UID: \"d4d04cef-5f05-4c8f-82a1-c1e8350c738c\") " pod="openshift-multus/multus-rktwr" Apr 28 19:16:26.536846 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.536832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv7dm\" (UniqueName: \"kubernetes.io/projected/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-kube-api-access-gv7dm\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:26.539127 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.539107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bxd2\" (UniqueName: \"kubernetes.io/projected/35de0ddf-e6a6-49cd-b5bd-9d110f16b469-kube-api-access-8bxd2\") pod 
\"iptables-alerter-z9bgv\" (UID: \"35de0ddf-e6a6-49cd-b5bd-9d110f16b469\") " pod="openshift-network-operator/iptables-alerter-z9bgv" Apr 28 19:16:26.541643 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.541621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pwrb\" (UniqueName: \"kubernetes.io/projected/84fd7385-382e-46ca-a795-1826538a8901-kube-api-access-5pwrb\") pod \"tuned-z4tgm\" (UID: \"84fd7385-382e-46ca-a795-1826538a8901\") " pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" Apr 28 19:16:26.630556 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-kubelet\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630556 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-run-ovn-kubernetes\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630577 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff2f50e1-de53-4f11-a477-9236b340536b-ovnkube-script-lib\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pnfw\" (UniqueName: \"kubernetes.io/projected/b58c1c28-103c-4a87-a33c-367210699fab-kube-api-access-4pnfw\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-cni-netd\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-run-ovn-kubernetes\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-kubelet\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630664 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff2f50e1-de53-4f11-a477-9236b340536b-ovnkube-config\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5119881-7aaa-4ea1-8738-f8463adc7b0c-serviceca\") pod \"node-ca-xfk6k\" (UID: \"b5119881-7aaa-4ea1-8738-f8463adc7b0c\") " pod="openshift-image-registry/node-ca-xfk6k" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-cni-netd\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-run-ovn\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-systemd-units\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.630801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630793 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-cni-bin\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-sys-fs\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51e675bf-bae4-491c-adfc-eae81fef84bf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-run-netns\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.631357 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-run-ovn\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-cni-bin\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fp5b\" (UniqueName: \"kubernetes.io/projected/b5119881-7aaa-4ea1-8738-f8463adc7b0c-kube-api-access-4fp5b\") pod \"node-ca-xfk6k\" (UID: \"b5119881-7aaa-4ea1-8738-f8463adc7b0c\") " pod="openshift-image-registry/node-ca-xfk6k" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-systemd-units\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff2f50e1-de53-4f11-a477-9236b340536b-ovn-node-metrics-cert\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.631357 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-sys-fs\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-run-openvswitch\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c1267be0-6760-422e-b253-3d9b5132c496-agent-certs\") pod \"konnectivity-agent-j966f\" (UID: \"c1267be0-6760-422e-b253-3d9b5132c496\") " pod="kube-system/konnectivity-agent-j966f" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-etc-selinux\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5119881-7aaa-4ea1-8738-f8463adc7b0c-host\") pod \"node-ca-xfk6k\" (UID: \"b5119881-7aaa-4ea1-8738-f8463adc7b0c\") " pod="openshift-image-registry/node-ca-xfk6k" Apr 28 
19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-cnibin\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f" Apr 28 19:16:26.631357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdd8\" (UniqueName: \"kubernetes.io/projected/ff2f50e1-de53-4f11-a477-9236b340536b-kube-api-access-2pdd8\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-slash\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-run-netns\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-var-lib-openvswitch\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631213 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-system-cni-dir\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-os-release\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff2f50e1-de53-4f11-a477-9236b340536b-env-overrides\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff2f50e1-de53-4f11-a477-9236b340536b-ovnkube-script-lib\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff2f50e1-de53-4f11-a477-9236b340536b-ovnkube-config\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.630664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51e675bf-bae4-491c-adfc-eae81fef84bf-cni-binary-copy\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-etc-selinux\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-device-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-host-slash\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-device-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-run-openvswitch\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632174 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-var-lib-openvswitch\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-run-systemd\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-run-systemd\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631501 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-system-cni-dir\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51e675bf-bae4-491c-adfc-eae81fef84bf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-cnibin\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-etc-openvswitch\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-etc-openvswitch\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5119881-7aaa-4ea1-8738-f8463adc7b0c-host\") pod \"node-ca-xfk6k\" (UID: \"b5119881-7aaa-4ea1-8738-f8463adc7b0c\") " pod="openshift-image-registry/node-ca-xfk6k"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c1267be0-6760-422e-b253-3d9b5132c496-konnectivity-ca\") pod \"konnectivity-agent-j966f\" (UID: \"c1267be0-6760-422e-b253-3d9b5132c496\") " pod="kube-system/konnectivity-agent-j966f"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51e675bf-bae4-491c-adfc-eae81fef84bf-os-release\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631699 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-socket-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/51e675bf-bae4-491c-adfc-eae81fef84bf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-node-log\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-log-socket\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr28s\" (UniqueName: \"kubernetes.io/projected/51e675bf-bae4-491c-adfc-eae81fef84bf-kube-api-access-nr28s\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.632996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51e675bf-bae4-491c-adfc-eae81fef84bf-cni-binary-copy\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.633787 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-registration-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.633787 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631843 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-socket-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.633787 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631925 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5119881-7aaa-4ea1-8738-f8463adc7b0c-serviceca\") pod \"node-ca-xfk6k\" (UID: \"b5119881-7aaa-4ea1-8738-f8463adc7b0c\") " pod="openshift-image-registry/node-ca-xfk6k"
Apr 28 19:16:26.633787 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.631974 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-node-log\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.633787 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.632019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b58c1c28-103c-4a87-a33c-367210699fab-registration-dir\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.633787 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.632051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff2f50e1-de53-4f11-a477-9236b340536b-env-overrides\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.633787 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.632055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff2f50e1-de53-4f11-a477-9236b340536b-log-socket\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.633787 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.632316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/51e675bf-bae4-491c-adfc-eae81fef84bf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.633787 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.632320 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c1267be0-6760-422e-b253-3d9b5132c496-konnectivity-ca\") pod \"konnectivity-agent-j966f\" (UID: \"c1267be0-6760-422e-b253-3d9b5132c496\") " pod="kube-system/konnectivity-agent-j966f"
Apr 28 19:16:26.634201 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.633959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c1267be0-6760-422e-b253-3d9b5132c496-agent-certs\") pod \"konnectivity-agent-j966f\" (UID: \"c1267be0-6760-422e-b253-3d9b5132c496\") " pod="kube-system/konnectivity-agent-j966f"
Apr 28 19:16:26.634201 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.634041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff2f50e1-de53-4f11-a477-9236b340536b-ovn-node-metrics-cert\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.643215 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.643145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdd8\" (UniqueName: \"kubernetes.io/projected/ff2f50e1-de53-4f11-a477-9236b340536b-kube-api-access-2pdd8\") pod \"ovnkube-node-5tdk8\" (UID: \"ff2f50e1-de53-4f11-a477-9236b340536b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.643357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.643307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr28s\" (UniqueName: \"kubernetes.io/projected/51e675bf-bae4-491c-adfc-eae81fef84bf-kube-api-access-nr28s\") pod \"multus-additional-cni-plugins-wt99f\" (UID: \"51e675bf-bae4-491c-adfc-eae81fef84bf\") " pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:26.643357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.643309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fp5b\" (UniqueName: \"kubernetes.io/projected/b5119881-7aaa-4ea1-8738-f8463adc7b0c-kube-api-access-4fp5b\") pod \"node-ca-xfk6k\" (UID: \"b5119881-7aaa-4ea1-8738-f8463adc7b0c\") " pod="openshift-image-registry/node-ca-xfk6k"
Apr 28 19:16:26.643789 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.643769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pnfw\" (UniqueName: \"kubernetes.io/projected/b58c1c28-103c-4a87-a33c-367210699fab-kube-api-access-4pnfw\") pod \"aws-ebs-csi-driver-node-vqwcj\" (UID: \"b58c1c28-103c-4a87-a33c-367210699fab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.712879 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.712840 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z4tgm"
Apr 28 19:16:26.720906 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.720878 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:26.723094 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.723074 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rktwr"
Apr 28 19:16:26.733841 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.733819 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z9bgv"
Apr 28 19:16:26.739919 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.739895 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:16:26.746585 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.746567 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j966f"
Apr 28 19:16:26.755119 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.755097 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xfk6k"
Apr 28 19:16:26.762696 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.762679 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj"
Apr 28 19:16:26.767283 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:26.767265 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wt99f"
Apr 28 19:16:27.036064 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.036024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhhk\" (UniqueName: \"kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk\") pod \"network-check-target-hgdfg\" (UID: \"ca288914-564f-4959-9c10-76a6327678fe\") " pod="openshift-network-diagnostics/network-check-target-hgdfg"
Apr 28 19:16:27.036255 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.036102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9"
Apr 28 19:16:27.036255 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:27.036213 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:27.036255 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:27.036234 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:27.036255 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:27.036244 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5vhhk for pod openshift-network-diagnostics/network-check-target-hgdfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:27.036388 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:27.036295 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk podName:ca288914-564f-4959-9c10-76a6327678fe nodeName:}" failed. No retries permitted until 2026-04-28 19:16:28.036281938 +0000 UTC m=+3.999182282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5vhhk" (UniqueName: "kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk") pod "network-check-target-hgdfg" (UID: "ca288914-564f-4959-9c10-76a6327678fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:27.036388 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:27.036215 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:27.036388 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:27.036330 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs podName:4236a3f6-5c96-4e29-bb77-8dafe3cd242d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:28.03632417 +0000 UTC m=+3.999224514 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs") pod "network-metrics-daemon-8j8w9" (UID: "4236a3f6-5c96-4e29-bb77-8dafe3cd242d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:27.148691 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:27.148669 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb58c1c28_103c_4a87_a33c_367210699fab.slice/crio-73e4a3663ceb5af5ebc3357c8753202935133b198fbc0961cb0812647c7e80dc WatchSource:0}: Error finding container 73e4a3663ceb5af5ebc3357c8753202935133b198fbc0961cb0812647c7e80dc: Status 404 returned error can't find the container with id 73e4a3663ceb5af5ebc3357c8753202935133b198fbc0961cb0812647c7e80dc
Apr 28 19:16:27.150242 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:27.150213 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff2f50e1_de53_4f11_a477_9236b340536b.slice/crio-5c3033aaeedb6b9fc80f3582c41a8760ad21bb5b6cfa482f7fa67ded4c6d2d4b WatchSource:0}: Error finding container 5c3033aaeedb6b9fc80f3582c41a8760ad21bb5b6cfa482f7fa67ded4c6d2d4b: Status 404 returned error can't find the container with id 5c3033aaeedb6b9fc80f3582c41a8760ad21bb5b6cfa482f7fa67ded4c6d2d4b
Apr 28 19:16:27.151105 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:27.151013 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35de0ddf_e6a6_49cd_b5bd_9d110f16b469.slice/crio-4f584d07e5624e92a6e9b564bfcfe4cf54f1953f192e793f242db6eb3d9e8c67 WatchSource:0}: Error finding container 4f584d07e5624e92a6e9b564bfcfe4cf54f1953f192e793f242db6eb3d9e8c67: Status 404 returned error can't find the container with id 4f584d07e5624e92a6e9b564bfcfe4cf54f1953f192e793f242db6eb3d9e8c67
Apr 28 19:16:27.152375 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:27.152352 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e675bf_bae4_491c_adfc_eae81fef84bf.slice/crio-dd8bff915a35db3dbbadbfee90ee63f78c047b7df948fb72e9782da18221aa1d WatchSource:0}: Error finding container dd8bff915a35db3dbbadbfee90ee63f78c047b7df948fb72e9782da18221aa1d: Status 404 returned error can't find the container with id dd8bff915a35db3dbbadbfee90ee63f78c047b7df948fb72e9782da18221aa1d
Apr 28 19:16:27.153727 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:27.153654 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1267be0_6760_422e_b253_3d9b5132c496.slice/crio-4cc8990420a0106ef31917c4b3aa292accd7d1efed98e88fc45b3a5411571f59 WatchSource:0}: Error finding container 4cc8990420a0106ef31917c4b3aa292accd7d1efed98e88fc45b3a5411571f59: Status 404 returned error can't find the container with id 4cc8990420a0106ef31917c4b3aa292accd7d1efed98e88fc45b3a5411571f59
Apr 28 19:16:27.154966 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:27.154817 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d04cef_5f05_4c8f_82a1_c1e8350c738c.slice/crio-aa4acc9354d6d818b2c6c62bc2e3727a3432c457ea3fcf5811825ca06067efd5 WatchSource:0}: Error finding container aa4acc9354d6d818b2c6c62bc2e3727a3432c457ea3fcf5811825ca06067efd5: Status 404 returned error can't find the container with id aa4acc9354d6d818b2c6c62bc2e3727a3432c457ea3fcf5811825ca06067efd5
Apr 28 19:16:27.157741 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:27.157707 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fd7385_382e_46ca_a795_1826538a8901.slice/crio-fd7d460cd9afe32fce04b6bc40b9fa6601ee5df9963de1ad8b702fbcd89cdeb1 WatchSource:0}: Error finding container fd7d460cd9afe32fce04b6bc40b9fa6601ee5df9963de1ad8b702fbcd89cdeb1: Status 404 returned error can't find the container with id fd7d460cd9afe32fce04b6bc40b9fa6601ee5df9963de1ad8b702fbcd89cdeb1
Apr 28 19:16:27.158536 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:27.158518 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5119881_7aaa_4ea1_8738_f8463adc7b0c.slice/crio-a6372f3848c13a3c4ecd7bf8ae1723e94479d7703cc94173c74c17c425e8de4a WatchSource:0}: Error finding container a6372f3848c13a3c4ecd7bf8ae1723e94479d7703cc94173c74c17c425e8de4a: Status 404 returned error can't find the container with id a6372f3848c13a3c4ecd7bf8ae1723e94479d7703cc94173c74c17c425e8de4a
Apr 28 19:16:27.464263 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.464165 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:25 +0000 UTC" deadline="2027-12-03 00:41:32.300997789 +0000 UTC"
Apr 28 19:16:27.464263 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.464205 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13997h25m4.836796775s"
Apr 28 19:16:27.522027 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.520629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" event={"ID":"e3440d423eacb4ec58cf1cc320321a41","Type":"ContainerStarted","Data":"efe7f41d7d0000952399da688694ed9406975031b3a30495069484d1bb8ff754"}
Apr 28 19:16:27.523131 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.522991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xfk6k" event={"ID":"b5119881-7aaa-4ea1-8738-f8463adc7b0c","Type":"ContainerStarted","Data":"a6372f3848c13a3c4ecd7bf8ae1723e94479d7703cc94173c74c17c425e8de4a"}
Apr 28 19:16:27.524585 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.524560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rktwr" event={"ID":"d4d04cef-5f05-4c8f-82a1-c1e8350c738c","Type":"ContainerStarted","Data":"aa4acc9354d6d818b2c6c62bc2e3727a3432c457ea3fcf5811825ca06067efd5"}
Apr 28 19:16:27.525828 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.525804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j966f" event={"ID":"c1267be0-6760-422e-b253-3d9b5132c496","Type":"ContainerStarted","Data":"4cc8990420a0106ef31917c4b3aa292accd7d1efed98e88fc45b3a5411571f59"}
Apr 28 19:16:27.532376 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.532350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" event={"ID":"b58c1c28-103c-4a87-a33c-367210699fab","Type":"ContainerStarted","Data":"73e4a3663ceb5af5ebc3357c8753202935133b198fbc0961cb0812647c7e80dc"}
Apr 28 19:16:27.534039 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.534014 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" event={"ID":"84fd7385-382e-46ca-a795-1826538a8901","Type":"ContainerStarted","Data":"fd7d460cd9afe32fce04b6bc40b9fa6601ee5df9963de1ad8b702fbcd89cdeb1"}
Apr 28 19:16:27.539057 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.539032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt99f" event={"ID":"51e675bf-bae4-491c-adfc-eae81fef84bf","Type":"ContainerStarted","Data":"dd8bff915a35db3dbbadbfee90ee63f78c047b7df948fb72e9782da18221aa1d"}
Apr 28 19:16:27.550938 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.550911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z9bgv" event={"ID":"35de0ddf-e6a6-49cd-b5bd-9d110f16b469","Type":"ContainerStarted","Data":"4f584d07e5624e92a6e9b564bfcfe4cf54f1953f192e793f242db6eb3d9e8c67"}
Apr 28 19:16:27.554165 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:27.554126 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" event={"ID":"ff2f50e1-de53-4f11-a477-9236b340536b","Type":"ContainerStarted","Data":"5c3033aaeedb6b9fc80f3582c41a8760ad21bb5b6cfa482f7fa67ded4c6d2d4b"}
Apr 28 19:16:28.043089 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:28.043019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9"
Apr 28 19:16:28.043274 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:28.043099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhhk\" (UniqueName: \"kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk\") pod \"network-check-target-hgdfg\" (UID: \"ca288914-564f-4959-9c10-76a6327678fe\") " pod="openshift-network-diagnostics/network-check-target-hgdfg"
Apr 28 19:16:28.043274 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:28.043252 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:28.043274 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:28.043271 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:28.043454 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:28.043284 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5vhhk for pod openshift-network-diagnostics/network-check-target-hgdfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:28.043454 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:28.043344 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk podName:ca288914-564f-4959-9c10-76a6327678fe nodeName:}" failed. No retries permitted until 2026-04-28 19:16:30.043325265 +0000 UTC m=+6.006225613 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5vhhk" (UniqueName: "kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk") pod "network-check-target-hgdfg" (UID: "ca288914-564f-4959-9c10-76a6327678fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:28.043828 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:28.043803 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:28.043937 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:28.043902 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs podName:4236a3f6-5c96-4e29-bb77-8dafe3cd242d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:30.043883873 +0000 UTC m=+6.006784220 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs") pod "network-metrics-daemon-8j8w9" (UID: "4236a3f6-5c96-4e29-bb77-8dafe3cd242d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:28.509818 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:28.509784 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9"
Apr 28 19:16:28.510428 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:28.509936 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d"
Apr 28 19:16:28.510428 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:28.510350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg"
Apr 28 19:16:28.510595 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:28.510442 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe"
Apr 28 19:16:28.574509 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:28.573260 2576 generic.go:358] "Generic (PLEG): container finished" podID="77560a24cc76a08538baa0efe4273073" containerID="1ee8bc562ddca53a5019eec6c899953a359c334be5570907c998077c1cba16aa" exitCode=0
Apr 28 19:16:28.574509 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:28.574447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" event={"ID":"77560a24cc76a08538baa0efe4273073","Type":"ContainerDied","Data":"1ee8bc562ddca53a5019eec6c899953a359c334be5570907c998077c1cba16aa"}
Apr 28 19:16:28.594222 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:28.593087 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-184.ec2.internal" podStartSLOduration=3.593066979 podStartE2EDuration="3.593066979s" podCreationTimestamp="2026-04-28 19:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:27.550801056 +0000 UTC m=+3.513701422" watchObservedRunningTime="2026-04-28 19:16:28.593066979 +0000 UTC m=+4.555967340"
Apr 28 19:16:29.579645 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:29.579045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" event={"ID":"77560a24cc76a08538baa0efe4273073","Type":"ContainerStarted","Data":"d528f489ff49e803e596113b9578cbcba9e98fd8f959eaf42de7ca7bb00c3e8a"}
Apr 28 19:16:29.594052 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:29.593997 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-184.ec2.internal" podStartSLOduration=4.59397749 podStartE2EDuration="4.59397749s" podCreationTimestamp="2026-04-28 19:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:29.593837816 +0000 UTC m=+5.556738192" watchObservedRunningTime="2026-04-28 19:16:29.59397749 +0000 UTC m=+5.556877865"
Apr 28 19:16:30.060549 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:30.059743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhhk\" (UniqueName: \"kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk\") pod \"network-check-target-hgdfg\" (UID: \"ca288914-564f-4959-9c10-76a6327678fe\") " pod="openshift-network-diagnostics/network-check-target-hgdfg"
Apr 28 19:16:30.060549 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:30.059818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9"
Apr 28 19:16:30.060549 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:30.059961 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:30.060549 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:30.060026 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs podName:4236a3f6-5c96-4e29-bb77-8dafe3cd242d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:34.060006473 +0000 UTC m=+10.022906823 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs") pod "network-metrics-daemon-8j8w9" (UID: "4236a3f6-5c96-4e29-bb77-8dafe3cd242d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:30.060549 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:30.060444 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:30.060549 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:30.060461 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:30.060549 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:30.060470 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5vhhk for pod openshift-network-diagnostics/network-check-target-hgdfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:30.060549 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:30.060506 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk podName:ca288914-564f-4959-9c10-76a6327678fe nodeName:}" failed. No retries permitted until 2026-04-28 19:16:34.060495412 +0000 UTC m=+10.023395756 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5vhhk" (UniqueName: "kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk") pod "network-check-target-hgdfg" (UID: "ca288914-564f-4959-9c10-76a6327678fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:30.509129 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:30.509095 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:30.509296 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:30.509241 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:30.509667 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:30.509644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:30.509829 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:30.509747 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:32.509734 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:32.509702 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:32.510163 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:32.509814 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:32.510163 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:32.509881 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:32.510163 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:32.509998 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:34.095971 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:34.095927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:34.096431 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:34.095992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhhk\" (UniqueName: \"kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk\") pod \"network-check-target-hgdfg\" (UID: \"ca288914-564f-4959-9c10-76a6327678fe\") " pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:34.096431 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:34.096129 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:34.096431 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:34.096148 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:34.096431 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:34.096160 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5vhhk for pod openshift-network-diagnostics/network-check-target-hgdfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:34.096431 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:34.096218 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk podName:ca288914-564f-4959-9c10-76a6327678fe nodeName:}" failed. No retries permitted until 2026-04-28 19:16:42.096198665 +0000 UTC m=+18.059099027 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5vhhk" (UniqueName: "kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk") pod "network-check-target-hgdfg" (UID: "ca288914-564f-4959-9c10-76a6327678fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:34.096726 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:34.096629 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:34.096726 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:34.096684 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs podName:4236a3f6-5c96-4e29-bb77-8dafe3cd242d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:42.096667288 +0000 UTC m=+18.059567650 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs") pod "network-metrics-daemon-8j8w9" (UID: "4236a3f6-5c96-4e29-bb77-8dafe3cd242d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:34.510926 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:34.510895 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:34.511077 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:34.510942 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:34.511077 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:34.511027 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:34.511143 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:34.511117 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:36.509854 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:36.509823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:36.510267 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:36.509957 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:36.510267 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:36.510013 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:36.510267 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:36.510120 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:37.031797 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.031764 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-rb8nk"] Apr 28 19:16:37.034497 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.034471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rb8nk" Apr 28 19:16:37.038552 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.038530 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pvc6r\"" Apr 28 19:16:37.039051 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.039033 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 28 19:16:37.039731 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.039715 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 28 19:16:37.120344 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.120317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/edd42d80-2884-4124-a4b2-2aea5543b72b-hosts-file\") pod \"node-resolver-rb8nk\" (UID: \"edd42d80-2884-4124-a4b2-2aea5543b72b\") " pod="openshift-dns/node-resolver-rb8nk" Apr 28 
19:16:37.120509 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.120416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edd42d80-2884-4124-a4b2-2aea5543b72b-tmp-dir\") pod \"node-resolver-rb8nk\" (UID: \"edd42d80-2884-4124-a4b2-2aea5543b72b\") " pod="openshift-dns/node-resolver-rb8nk" Apr 28 19:16:37.120509 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.120440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbbbh\" (UniqueName: \"kubernetes.io/projected/edd42d80-2884-4124-a4b2-2aea5543b72b-kube-api-access-dbbbh\") pod \"node-resolver-rb8nk\" (UID: \"edd42d80-2884-4124-a4b2-2aea5543b72b\") " pod="openshift-dns/node-resolver-rb8nk" Apr 28 19:16:37.221262 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.221229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/edd42d80-2884-4124-a4b2-2aea5543b72b-hosts-file\") pod \"node-resolver-rb8nk\" (UID: \"edd42d80-2884-4124-a4b2-2aea5543b72b\") " pod="openshift-dns/node-resolver-rb8nk" Apr 28 19:16:37.221394 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.221322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edd42d80-2884-4124-a4b2-2aea5543b72b-tmp-dir\") pod \"node-resolver-rb8nk\" (UID: \"edd42d80-2884-4124-a4b2-2aea5543b72b\") " pod="openshift-dns/node-resolver-rb8nk" Apr 28 19:16:37.221394 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.221343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbbbh\" (UniqueName: \"kubernetes.io/projected/edd42d80-2884-4124-a4b2-2aea5543b72b-kube-api-access-dbbbh\") pod \"node-resolver-rb8nk\" (UID: \"edd42d80-2884-4124-a4b2-2aea5543b72b\") " pod="openshift-dns/node-resolver-rb8nk" Apr 28 
19:16:37.221394 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.221355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/edd42d80-2884-4124-a4b2-2aea5543b72b-hosts-file\") pod \"node-resolver-rb8nk\" (UID: \"edd42d80-2884-4124-a4b2-2aea5543b72b\") " pod="openshift-dns/node-resolver-rb8nk" Apr 28 19:16:37.221682 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.221666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/edd42d80-2884-4124-a4b2-2aea5543b72b-tmp-dir\") pod \"node-resolver-rb8nk\" (UID: \"edd42d80-2884-4124-a4b2-2aea5543b72b\") " pod="openshift-dns/node-resolver-rb8nk" Apr 28 19:16:37.234969 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.234938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbbbh\" (UniqueName: \"kubernetes.io/projected/edd42d80-2884-4124-a4b2-2aea5543b72b-kube-api-access-dbbbh\") pod \"node-resolver-rb8nk\" (UID: \"edd42d80-2884-4124-a4b2-2aea5543b72b\") " pod="openshift-dns/node-resolver-rb8nk" Apr 28 19:16:37.344614 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:37.344513 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rb8nk" Apr 28 19:16:38.509217 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:38.509177 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:38.509668 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:38.509232 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:38.509668 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:38.509324 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:38.509668 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:38.509470 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:40.509758 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:40.509491 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:40.510194 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:40.509518 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:40.510194 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:40.509887 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:40.510194 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:40.509953 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:42.154946 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:42.154906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhhk\" (UniqueName: \"kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk\") pod \"network-check-target-hgdfg\" (UID: \"ca288914-564f-4959-9c10-76a6327678fe\") " pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:42.155383 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:42.154980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:42.155383 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:42.155063 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:42.155383 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:42.155076 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:42.155383 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:42.155084 2576 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:42.155383 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:42.155096 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5vhhk for pod openshift-network-diagnostics/network-check-target-hgdfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:42.155383 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:42.155138 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk podName:ca288914-564f-4959-9c10-76a6327678fe nodeName:}" failed. No retries permitted until 2026-04-28 19:16:58.155121137 +0000 UTC m=+34.118021481 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5vhhk" (UniqueName: "kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk") pod "network-check-target-hgdfg" (UID: "ca288914-564f-4959-9c10-76a6327678fe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:42.155383 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:42.155155 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs podName:4236a3f6-5c96-4e29-bb77-8dafe3cd242d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:58.15514678 +0000 UTC m=+34.118047125 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs") pod "network-metrics-daemon-8j8w9" (UID: "4236a3f6-5c96-4e29-bb77-8dafe3cd242d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:42.509220 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:42.509188 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:42.509220 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:42.509158 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:42.509468 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:42.509355 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:42.509527 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:42.509499 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:44.442197 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:44.442092 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd42d80_2884_4124_a4b2_2aea5543b72b.slice/crio-c1941277c7637ca0330fa2cd354797145af15e74378fc8220fcd83d3e741e26c WatchSource:0}: Error finding container c1941277c7637ca0330fa2cd354797145af15e74378fc8220fcd83d3e741e26c: Status 404 returned error can't find the container with id c1941277c7637ca0330fa2cd354797145af15e74378fc8220fcd83d3e741e26c Apr 28 19:16:44.510234 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:44.510211 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:44.510348 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:44.510323 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:44.510417 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:44.510384 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:44.510526 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:44.510507 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:44.615274 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:44.615243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rb8nk" event={"ID":"edd42d80-2884-4124-a4b2-2aea5543b72b","Type":"ContainerStarted","Data":"c1941277c7637ca0330fa2cd354797145af15e74378fc8220fcd83d3e741e26c"} Apr 28 19:16:45.620217 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.619972 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:16:45.620984 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.620454 2576 generic.go:358] "Generic (PLEG): container finished" podID="ff2f50e1-de53-4f11-a477-9236b340536b" containerID="13c2780aec95c6dc7e5d67af193a4fb877c8012a19e2c34f7b3ff92c032a6413" exitCode=1 Apr 28 19:16:45.620984 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.620496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" event={"ID":"ff2f50e1-de53-4f11-a477-9236b340536b","Type":"ContainerStarted","Data":"0fd104e8cb95f1967e5c0cd1455230db96edd4347f9d868f903d6529fc114840"} Apr 28 19:16:45.620984 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.620530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" event={"ID":"ff2f50e1-de53-4f11-a477-9236b340536b","Type":"ContainerStarted","Data":"250bc075dea4db7e676d30002e8864f9312b5e6c1bac978a10ce45b259a2b4d0"} Apr 28 19:16:45.620984 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.620544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" event={"ID":"ff2f50e1-de53-4f11-a477-9236b340536b","Type":"ContainerStarted","Data":"8dfceed9031db3f4ae5fcad30dab21ef1bb1f51085e7bd0f0873ad4431443c90"} Apr 28 
19:16:45.620984 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.620553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" event={"ID":"ff2f50e1-de53-4f11-a477-9236b340536b","Type":"ContainerStarted","Data":"94666d8037399db7f205e1c221e6ea6d6800035155e12269366f3b8e9cdbfd76"} Apr 28 19:16:45.620984 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.620562 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" event={"ID":"ff2f50e1-de53-4f11-a477-9236b340536b","Type":"ContainerDied","Data":"13c2780aec95c6dc7e5d67af193a4fb877c8012a19e2c34f7b3ff92c032a6413"} Apr 28 19:16:45.620984 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.620572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" event={"ID":"ff2f50e1-de53-4f11-a477-9236b340536b","Type":"ContainerStarted","Data":"61c0d6593a7ed1ed1657689adeafd9a2c0a2667e8d25366b6e874d350514a145"} Apr 28 19:16:45.621912 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.621882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rb8nk" event={"ID":"edd42d80-2884-4124-a4b2-2aea5543b72b","Type":"ContainerStarted","Data":"09d734a144560ed4bc8e0048d682e58730d2553c27377495c55b38da09313876"} Apr 28 19:16:45.623299 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.623265 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xfk6k" event={"ID":"b5119881-7aaa-4ea1-8738-f8463adc7b0c","Type":"ContainerStarted","Data":"32a7601c304007c6fa96df422035ec86168c675c03d9466bdde92d2f8fffec5f"} Apr 28 19:16:45.624575 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.624547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rktwr" event={"ID":"d4d04cef-5f05-4c8f-82a1-c1e8350c738c","Type":"ContainerStarted","Data":"3d46019e85552b659025574605f762331c56b5ab6c9d0944c39d045236e5b8e0"} Apr 
28 19:16:45.625922 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.625886 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j966f" event={"ID":"c1267be0-6760-422e-b253-3d9b5132c496","Type":"ContainerStarted","Data":"e3df2f4808c4af968c7b00423e70cb9539e29a0074ef28257ea3f023f6e26c8c"} Apr 28 19:16:45.627648 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.627623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" event={"ID":"b58c1c28-103c-4a87-a33c-367210699fab","Type":"ContainerStarted","Data":"9a5107086a7f464b3ce3a529ff61ccc856927280e8680f7913b0b5425021c0d5"} Apr 28 19:16:45.628833 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.628811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" event={"ID":"84fd7385-382e-46ca-a795-1826538a8901","Type":"ContainerStarted","Data":"3f8f4bfa99e56c08b4c806cbfd1ea8219cc9a08e505b270764cbcd92ca59569e"} Apr 28 19:16:45.630173 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.630150 2576 generic.go:358] "Generic (PLEG): container finished" podID="51e675bf-bae4-491c-adfc-eae81fef84bf" containerID="eef334db38c333c968b06492099215eccc240f65167814a1f5b3663c0861ea2a" exitCode=0 Apr 28 19:16:45.630268 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.630188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt99f" event={"ID":"51e675bf-bae4-491c-adfc-eae81fef84bf","Type":"ContainerDied","Data":"eef334db38c333c968b06492099215eccc240f65167814a1f5b3663c0861ea2a"} Apr 28 19:16:45.639443 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.639392 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rb8nk" podStartSLOduration=8.639372318 podStartE2EDuration="8.639372318s" podCreationTimestamp="2026-04-28 19:16:37 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:45.638379849 +0000 UTC m=+21.601280226" watchObservedRunningTime="2026-04-28 19:16:45.639372318 +0000 UTC m=+21.602272686" Apr 28 19:16:45.677828 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.677773 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-z4tgm" podStartSLOduration=4.427124181 podStartE2EDuration="21.677758432s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:27.181858719 +0000 UTC m=+3.144759067" lastFinishedPulling="2026-04-28 19:16:44.432492965 +0000 UTC m=+20.395393318" observedRunningTime="2026-04-28 19:16:45.658251218 +0000 UTC m=+21.621151588" watchObservedRunningTime="2026-04-28 19:16:45.677758432 +0000 UTC m=+21.640658805" Apr 28 19:16:45.677978 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.677907 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rktwr" podStartSLOduration=4.394313465 podStartE2EDuration="21.677902878s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:27.157896926 +0000 UTC m=+3.120797271" lastFinishedPulling="2026-04-28 19:16:44.441486325 +0000 UTC m=+20.404386684" observedRunningTime="2026-04-28 19:16:45.677336608 +0000 UTC m=+21.640236974" watchObservedRunningTime="2026-04-28 19:16:45.677902878 +0000 UTC m=+21.640803288" Apr 28 19:16:45.694424 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.694371 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-j966f" podStartSLOduration=4.47926795 podStartE2EDuration="21.694356447s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:27.181881036 +0000 UTC m=+3.144781380" lastFinishedPulling="2026-04-28 19:16:44.396969528 +0000 UTC m=+20.359869877" 
observedRunningTime="2026-04-28 19:16:45.694149925 +0000 UTC m=+21.657050311" watchObservedRunningTime="2026-04-28 19:16:45.694356447 +0000 UTC m=+21.657256812" Apr 28 19:16:45.710533 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:45.710482 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xfk6k" podStartSLOduration=4.495353761 podStartE2EDuration="21.710460136s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:27.181861223 +0000 UTC m=+3.144761571" lastFinishedPulling="2026-04-28 19:16:44.396967589 +0000 UTC m=+20.359867946" observedRunningTime="2026-04-28 19:16:45.710064239 +0000 UTC m=+21.672964605" watchObservedRunningTime="2026-04-28 19:16:45.710460136 +0000 UTC m=+21.673360501" Apr 28 19:16:46.206254 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:46.205487 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 28 19:16:46.487053 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:46.486893 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:16:46.205849019Z","UUID":"d858d3cb-ed1f-4f4f-813d-74573084805a","Handler":null,"Name":"","Endpoint":""} Apr 28 19:16:46.489509 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:46.489492 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 28 19:16:46.489509 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:46.489514 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 28 19:16:46.509112 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:46.509081 2576 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:46.509112 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:46.509095 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:46.509297 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:46.509203 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:46.509347 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:46.509320 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:46.634098 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:46.634063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" event={"ID":"b58c1c28-103c-4a87-a33c-367210699fab","Type":"ContainerStarted","Data":"fabeecad5f89999d3ec084d42e685aa3918ab04aaeaaaf98fd8a576f54a9953f"} Apr 28 19:16:46.635669 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:46.635637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z9bgv" event={"ID":"35de0ddf-e6a6-49cd-b5bd-9d110f16b469","Type":"ContainerStarted","Data":"3c31c60bb0d82d5c91b4aa568cbb46088b7cb5d3ef2982bde12c00ec57c537c5"} Apr 28 19:16:46.654776 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:46.654727 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-z9bgv" podStartSLOduration=5.410825549 podStartE2EDuration="22.654712846s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:27.15312155 +0000 UTC m=+3.116021899" lastFinishedPulling="2026-04-28 19:16:44.397008852 +0000 UTC m=+20.359909196" observedRunningTime="2026-04-28 19:16:46.654586481 +0000 UTC m=+22.617486848" watchObservedRunningTime="2026-04-28 19:16:46.654712846 +0000 UTC m=+22.617613212" Apr 28 19:16:47.640045 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:47.639968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" event={"ID":"b58c1c28-103c-4a87-a33c-367210699fab","Type":"ContainerStarted","Data":"8956d7bd21c7259a62a3e1ebe5779c8869f67e3e917c572919e652011e46d66c"} Apr 28 19:16:47.643663 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:47.643646 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:16:47.644051 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:47.644027 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" event={"ID":"ff2f50e1-de53-4f11-a477-9236b340536b","Type":"ContainerStarted","Data":"c29450785de1ae623b601bdaaee8d8acfce55d3211e712ee2ea0146b90c12e7c"} Apr 28 19:16:47.671143 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:47.671082 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vqwcj" podStartSLOduration=3.41424454 podStartE2EDuration="23.67106323s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:27.150337305 +0000 UTC m=+3.113237652" lastFinishedPulling="2026-04-28 19:16:47.407155984 +0000 UTC m=+23.370056342" observedRunningTime="2026-04-28 19:16:47.670765727 +0000 UTC m=+23.633666092" watchObservedRunningTime="2026-04-28 19:16:47.67106323 +0000 UTC m=+23.633963600" Apr 28 19:16:48.509324 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:48.509295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:48.509517 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:48.509428 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:48.509517 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:48.509483 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:48.509661 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:48.509581 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:50.509491 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.509312 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:50.510049 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.509313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:50.510049 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:50.509557 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:50.510049 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:50.509657 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:50.541438 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.541406 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-j966f" Apr 28 19:16:50.541979 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.541962 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-j966f" Apr 28 19:16:50.651219 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.651185 2576 generic.go:358] "Generic (PLEG): container finished" podID="51e675bf-bae4-491c-adfc-eae81fef84bf" containerID="049e91cb23abb004ca439e300c5f88c33865cbb9ea2a0f947f72adc3e0db1875" exitCode=0 Apr 28 19:16:50.651358 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.651257 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt99f" event={"ID":"51e675bf-bae4-491c-adfc-eae81fef84bf","Type":"ContainerDied","Data":"049e91cb23abb004ca439e300c5f88c33865cbb9ea2a0f947f72adc3e0db1875"} Apr 28 19:16:50.654328 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.654309 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:16:50.654669 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.654644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" event={"ID":"ff2f50e1-de53-4f11-a477-9236b340536b","Type":"ContainerStarted","Data":"8e72b673ac104654059b1af5a19aff35b65cf14b5ba79108b789df81b3a3c7de"} Apr 28 19:16:50.654970 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.654935 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-j966f" Apr 28 19:16:50.655162 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.654982 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:50.655162 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.654997 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:50.655162 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.655010 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:50.655162 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.655065 2576 scope.go:117] "RemoveContainer" containerID="13c2780aec95c6dc7e5d67af193a4fb877c8012a19e2c34f7b3ff92c032a6413" Apr 28 19:16:50.656277 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.656257 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-j966f" Apr 28 19:16:50.672233 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.672214 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:50.672318 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:50.672309 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" Apr 28 19:16:51.658708 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:51.658439 2576 generic.go:358] "Generic (PLEG): container finished" podID="51e675bf-bae4-491c-adfc-eae81fef84bf" containerID="4e4ef36a02bec53520563abc81da8336c616522f61743b31a2f3b85c346c9d0b" exitCode=0 Apr 28 19:16:51.658708 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:51.658536 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt99f" event={"ID":"51e675bf-bae4-491c-adfc-eae81fef84bf","Type":"ContainerDied","Data":"4e4ef36a02bec53520563abc81da8336c616522f61743b31a2f3b85c346c9d0b"} Apr 28 19:16:51.662142 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:51.662124 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:16:51.662516 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:51.662494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" event={"ID":"ff2f50e1-de53-4f11-a477-9236b340536b","Type":"ContainerStarted","Data":"c9eae100c2bdcdcdd023af3c6a3ef4b761633be2281aaf57a1e9b0cb5cfbf608"} Apr 28 19:16:51.720723 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:51.720671 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8" podStartSLOduration=10.40061237 podStartE2EDuration="27.72065443s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:27.152500041 +0000 UTC m=+3.115400388" lastFinishedPulling="2026-04-28 19:16:44.472542104 +0000 UTC m=+20.435442448" observedRunningTime="2026-04-28 19:16:51.72038283 +0000 UTC m=+27.683283195" watchObservedRunningTime="2026-04-28 19:16:51.72065443 +0000 UTC m=+27.683554793" Apr 28 19:16:51.973515 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:51.973432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hgdfg"] Apr 28 19:16:51.973681 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:51.973564 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:51.973739 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:51.973687 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:51.976734 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:51.976711 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8j8w9"] Apr 28 19:16:51.976855 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:51.976824 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:51.976945 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:51.976925 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:52.666182 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:52.666098 2576 generic.go:358] "Generic (PLEG): container finished" podID="51e675bf-bae4-491c-adfc-eae81fef84bf" containerID="bc886c942a1d1277b291f8ba7787cec65abd8730a3f61eac85d29aff116af58d" exitCode=0 Apr 28 19:16:52.666708 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:52.666189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt99f" event={"ID":"51e675bf-bae4-491c-adfc-eae81fef84bf","Type":"ContainerDied","Data":"bc886c942a1d1277b291f8ba7787cec65abd8730a3f61eac85d29aff116af58d"} Apr 28 19:16:53.509344 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:53.509311 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:53.509512 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:53.509311 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:53.509512 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:53.509447 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:53.509512 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:53.509495 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:55.509456 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:55.509420 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:55.510236 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:55.509430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:55.510236 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:55.509541 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hgdfg" podUID="ca288914-564f-4959-9c10-76a6327678fe" Apr 28 19:16:55.510236 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:55.509649 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d" Apr 28 19:16:57.339122 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.338937 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-184.ec2.internal" event="NodeReady" Apr 28 19:16:57.339641 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.339248 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 28 19:16:57.389193 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.389163 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qjvwn"] Apr 28 19:16:57.393340 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.393313 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qjvwn" Apr 28 19:16:57.396236 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.395489 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-j8mfv"] Apr 28 19:16:57.396450 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.396427 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6qdcw\"" Apr 28 19:16:57.396643 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.396625 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 28 19:16:57.396849 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.396832 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 28 19:16:57.398424 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.398404 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-j8mfv" Apr 28 19:16:57.400583 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.400551 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qjvwn"] Apr 28 19:16:57.401428 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.401399 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 28 19:16:57.402735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.402712 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pzmft\"" Apr 28 19:16:57.402948 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.402935 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 28 19:16:57.403153 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.403136 2576 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 28 19:16:57.416497 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.416475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-j8mfv"] Apr 28 19:16:57.470857 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.470736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn" Apr 28 19:16:57.470857 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.470805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9n4n\" (UniqueName: \"kubernetes.io/projected/eb3c06b0-d193-4c98-afaf-e689e3a82af8-kube-api-access-l9n4n\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv" Apr 28 19:16:57.470857 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.470833 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/018d803a-f231-469e-8539-32dcc07e43f8-tmp-dir\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn" Apr 28 19:16:57.471085 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.470915 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5dd\" (UniqueName: \"kubernetes.io/projected/018d803a-f231-469e-8539-32dcc07e43f8-kube-api-access-8v5dd\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn" Apr 28 19:16:57.471085 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.470955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/018d803a-f231-469e-8539-32dcc07e43f8-config-volume\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn" Apr 28 19:16:57.471085 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.470998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv" Apr 28 19:16:57.509229 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.509203 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:16:57.509400 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.509203 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg" Apr 28 19:16:57.512325 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.512297 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wz9tz\"" Apr 28 19:16:57.512325 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.512311 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 28 19:16:57.512496 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.512377 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 28 19:16:57.512496 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.512377 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tslnt\"" Apr 28 19:16:57.512496 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.512434 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 28 19:16:57.572235 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.572206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn" Apr 28 19:16:57.572408 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.572241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9n4n\" (UniqueName: \"kubernetes.io/projected/eb3c06b0-d193-4c98-afaf-e689e3a82af8-kube-api-access-l9n4n\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv" Apr 28 
19:16:57.572408 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.572260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/018d803a-f231-469e-8539-32dcc07e43f8-tmp-dir\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:16:57.572408 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.572339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5dd\" (UniqueName: \"kubernetes.io/projected/018d803a-f231-469e-8539-32dcc07e43f8-kube-api-access-8v5dd\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:16:57.572408 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.572376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/018d803a-f231-469e-8539-32dcc07e43f8-config-volume\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:16:57.572627 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.572421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:16:57.572627 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.572558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/018d803a-f231-469e-8539-32dcc07e43f8-tmp-dir\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:16:57.572730 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:57.572660 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:57.572730 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:57.572681 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:57.572730 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:57.572720 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert podName:eb3c06b0-d193-4c98-afaf-e689e3a82af8 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:58.072701319 +0000 UTC m=+34.035601681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert") pod "ingress-canary-j8mfv" (UID: "eb3c06b0-d193-4c98-afaf-e689e3a82af8") : secret "canary-serving-cert" not found
Apr 28 19:16:57.572855 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:57.572738 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls podName:018d803a-f231-469e-8539-32dcc07e43f8 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:58.072729448 +0000 UTC m=+34.035629792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls") pod "dns-default-qjvwn" (UID: "018d803a-f231-469e-8539-32dcc07e43f8") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:57.573444 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.573416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/018d803a-f231-469e-8539-32dcc07e43f8-config-volume\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:16:57.587114 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.587087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5dd\" (UniqueName: \"kubernetes.io/projected/018d803a-f231-469e-8539-32dcc07e43f8-kube-api-access-8v5dd\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:16:57.587248 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:57.587112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9n4n\" (UniqueName: \"kubernetes.io/projected/eb3c06b0-d193-4c98-afaf-e689e3a82af8-kube-api-access-l9n4n\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:16:58.075808 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:58.075763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:16:58.076010 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:58.075873 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:16:58.076010 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:58.075933 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:58.076010 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:58.075964 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:58.076167 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:58.076011 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls podName:018d803a-f231-469e-8539-32dcc07e43f8 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:59.075990966 +0000 UTC m=+35.038891312 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls") pod "dns-default-qjvwn" (UID: "018d803a-f231-469e-8539-32dcc07e43f8") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:58.076167 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:58.076029 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert podName:eb3c06b0-d193-4c98-afaf-e689e3a82af8 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:59.076020747 +0000 UTC m=+35.038921096 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert") pod "ingress-canary-j8mfv" (UID: "eb3c06b0-d193-4c98-afaf-e689e3a82af8") : secret "canary-serving-cert" not found
Apr 28 19:16:58.176798 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:58.176766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhhk\" (UniqueName: \"kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk\") pod \"network-check-target-hgdfg\" (UID: \"ca288914-564f-4959-9c10-76a6327678fe\") " pod="openshift-network-diagnostics/network-check-target-hgdfg"
Apr 28 19:16:58.176956 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:58.176831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9"
Apr 28 19:16:58.176956 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:58.176917 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:16:58.177036 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:58.176969 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs podName:4236a3f6-5c96-4e29-bb77-8dafe3cd242d nodeName:}" failed. No retries permitted until 2026-04-28 19:17:30.176955481 +0000 UTC m=+66.139855825 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs") pod "network-metrics-daemon-8j8w9" (UID: "4236a3f6-5c96-4e29-bb77-8dafe3cd242d") : secret "metrics-daemon-secret" not found
Apr 28 19:16:58.179329 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:58.179304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vhhk\" (UniqueName: \"kubernetes.io/projected/ca288914-564f-4959-9c10-76a6327678fe-kube-api-access-5vhhk\") pod \"network-check-target-hgdfg\" (UID: \"ca288914-564f-4959-9c10-76a6327678fe\") " pod="openshift-network-diagnostics/network-check-target-hgdfg"
Apr 28 19:16:58.426049 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:58.425981 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hgdfg"
Apr 28 19:16:58.696133 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:58.696104 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hgdfg"]
Apr 28 19:16:58.706553 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:16:58.706520 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca288914_564f_4959_9c10_76a6327678fe.slice/crio-5b6e4a922d805997d4c3f12c13c56082b3ab174dbefaf2de014603ed6431c856 WatchSource:0}: Error finding container 5b6e4a922d805997d4c3f12c13c56082b3ab174dbefaf2de014603ed6431c856: Status 404 returned error can't find the container with id 5b6e4a922d805997d4c3f12c13c56082b3ab174dbefaf2de014603ed6431c856
Apr 28 19:16:59.084191 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:59.084165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:16:59.084332 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:59.084215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:16:59.084332 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:59.084319 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:59.084332 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:59.084319 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:59.084431 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:59.084381 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert podName:eb3c06b0-d193-4c98-afaf-e689e3a82af8 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:01.084366193 +0000 UTC m=+37.047266536 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert") pod "ingress-canary-j8mfv" (UID: "eb3c06b0-d193-4c98-afaf-e689e3a82af8") : secret "canary-serving-cert" not found
Apr 28 19:16:59.084431 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:16:59.084395 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls podName:018d803a-f231-469e-8539-32dcc07e43f8 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:01.08438973 +0000 UTC m=+37.047290073 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls") pod "dns-default-qjvwn" (UID: "018d803a-f231-469e-8539-32dcc07e43f8") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:59.680204 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:59.680169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hgdfg" event={"ID":"ca288914-564f-4959-9c10-76a6327678fe","Type":"ContainerStarted","Data":"5b6e4a922d805997d4c3f12c13c56082b3ab174dbefaf2de014603ed6431c856"}
Apr 28 19:16:59.682915 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:59.682887 2576 generic.go:358] "Generic (PLEG): container finished" podID="51e675bf-bae4-491c-adfc-eae81fef84bf" containerID="46a1be5849e98d07073198163f2c8268570b5d24780c838529f5943b86c9301c" exitCode=0
Apr 28 19:16:59.683034 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:16:59.682928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt99f" event={"ID":"51e675bf-bae4-491c-adfc-eae81fef84bf","Type":"ContainerDied","Data":"46a1be5849e98d07073198163f2c8268570b5d24780c838529f5943b86c9301c"}
Apr 28 19:17:00.687894 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:00.687687 2576 generic.go:358] "Generic (PLEG): container finished" podID="51e675bf-bae4-491c-adfc-eae81fef84bf" containerID="c252c7cfc919702dfb93fa337abdefb220712feec737440dbe9db0f9e627504d" exitCode=0
Apr 28 19:17:00.688324 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:00.687818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt99f" event={"ID":"51e675bf-bae4-491c-adfc-eae81fef84bf","Type":"ContainerDied","Data":"c252c7cfc919702dfb93fa337abdefb220712feec737440dbe9db0f9e627504d"}
Apr 28 19:17:01.099170 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:01.099139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:17:01.099356 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:01.099194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:17:01.099356 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:01.099298 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:01.099356 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:01.099310 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:01.099356 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:01.099350 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert podName:eb3c06b0-d193-4c98-afaf-e689e3a82af8 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:05.099337325 +0000 UTC m=+41.062237670 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert") pod "ingress-canary-j8mfv" (UID: "eb3c06b0-d193-4c98-afaf-e689e3a82af8") : secret "canary-serving-cert" not found
Apr 28 19:17:01.099542 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:01.099374 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls podName:018d803a-f231-469e-8539-32dcc07e43f8 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:05.099356975 +0000 UTC m=+41.062257324 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls") pod "dns-default-qjvwn" (UID: "018d803a-f231-469e-8539-32dcc07e43f8") : secret "dns-default-metrics-tls" not found
Apr 28 19:17:01.693264 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:01.693228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wt99f" event={"ID":"51e675bf-bae4-491c-adfc-eae81fef84bf","Type":"ContainerStarted","Data":"2566117e508f14860934a71bc3e02368687ac0a3f63ad85a30f5ef93614c0960"}
Apr 28 19:17:01.723637 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:01.723526 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wt99f" podStartSLOduration=6.325201314 podStartE2EDuration="37.723511746s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:27.156019644 +0000 UTC m=+3.118919988" lastFinishedPulling="2026-04-28 19:16:58.554330077 +0000 UTC m=+34.517230420" observedRunningTime="2026-04-28 19:17:01.72185988 +0000 UTC m=+37.684760287" watchObservedRunningTime="2026-04-28 19:17:01.723511746 +0000 UTC m=+37.686412112"
Apr 28 19:17:02.696648 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:02.696616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hgdfg" event={"ID":"ca288914-564f-4959-9c10-76a6327678fe","Type":"ContainerStarted","Data":"ad00a4cfc52d0344e8d0d74155d8010f5055ff65f4e90f95e1b77abbe73cbe88"}
Apr 28 19:17:02.697181 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:02.696875 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hgdfg"
Apr 28 19:17:02.717230 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:02.717181 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hgdfg" podStartSLOduration=35.701110095 podStartE2EDuration="38.717164816s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:58.710581616 +0000 UTC m=+34.673481960" lastFinishedPulling="2026-04-28 19:17:01.726636337 +0000 UTC m=+37.689536681" observedRunningTime="2026-04-28 19:17:02.716839507 +0000 UTC m=+38.679739874" watchObservedRunningTime="2026-04-28 19:17:02.717164816 +0000 UTC m=+38.680065182"
Apr 28 19:17:05.128028 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:05.127988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:17:05.128446 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:05.128058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:17:05.128446 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:05.128153 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:05.128446 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:05.128153 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:05.128446 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:05.128221 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls podName:018d803a-f231-469e-8539-32dcc07e43f8 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:13.128201329 +0000 UTC m=+49.091101686 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls") pod "dns-default-qjvwn" (UID: "018d803a-f231-469e-8539-32dcc07e43f8") : secret "dns-default-metrics-tls" not found
Apr 28 19:17:05.128446 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:05.128234 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert podName:eb3c06b0-d193-4c98-afaf-e689e3a82af8 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:13.128228926 +0000 UTC m=+49.091129269 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert") pod "ingress-canary-j8mfv" (UID: "eb3c06b0-d193-4c98-afaf-e689e3a82af8") : secret "canary-serving-cert" not found
Apr 28 19:17:13.188191 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:13.188152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:17:13.188567 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:13.188212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:17:13.188567 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:13.188311 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:13.188567 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:13.188313 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:13.188567 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:13.188375 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls podName:018d803a-f231-469e-8539-32dcc07e43f8 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:29.188359935 +0000 UTC m=+65.151260280 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls") pod "dns-default-qjvwn" (UID: "018d803a-f231-469e-8539-32dcc07e43f8") : secret "dns-default-metrics-tls" not found
Apr 28 19:17:13.188567 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:13.188392 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert podName:eb3c06b0-d193-4c98-afaf-e689e3a82af8 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:29.188385004 +0000 UTC m=+65.151285348 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert") pod "ingress-canary-j8mfv" (UID: "eb3c06b0-d193-4c98-afaf-e689e3a82af8") : secret "canary-serving-cert" not found
Apr 28 19:17:14.273082 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.273050 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"]
Apr 28 19:17:14.312648 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.312591 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"]
Apr 28 19:17:14.312799 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.312742 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.316348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.316324 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 28 19:17:14.317497 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.317481 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 28 19:17:14.317579 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.317486 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 28 19:17:14.317643 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.317488 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 28 19:17:14.398009 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.397978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f2c120fa-2092-4be2-b6ea-21bb0e34bb95-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c8d96db98-nj6lg\" (UID: \"f2c120fa-2092-4be2-b6ea-21bb0e34bb95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.398009 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.398008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hb2b\" (UniqueName: \"kubernetes.io/projected/f2c120fa-2092-4be2-b6ea-21bb0e34bb95-kube-api-access-8hb2b\") pod \"klusterlet-addon-workmgr-5c8d96db98-nj6lg\" (UID: \"f2c120fa-2092-4be2-b6ea-21bb0e34bb95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.398194 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.398038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c120fa-2092-4be2-b6ea-21bb0e34bb95-tmp\") pod \"klusterlet-addon-workmgr-5c8d96db98-nj6lg\" (UID: \"f2c120fa-2092-4be2-b6ea-21bb0e34bb95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.498539 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.498511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f2c120fa-2092-4be2-b6ea-21bb0e34bb95-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c8d96db98-nj6lg\" (UID: \"f2c120fa-2092-4be2-b6ea-21bb0e34bb95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.498724 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.498544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hb2b\" (UniqueName: \"kubernetes.io/projected/f2c120fa-2092-4be2-b6ea-21bb0e34bb95-kube-api-access-8hb2b\") pod \"klusterlet-addon-workmgr-5c8d96db98-nj6lg\" (UID: \"f2c120fa-2092-4be2-b6ea-21bb0e34bb95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.498724 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.498580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c120fa-2092-4be2-b6ea-21bb0e34bb95-tmp\") pod \"klusterlet-addon-workmgr-5c8d96db98-nj6lg\" (UID: \"f2c120fa-2092-4be2-b6ea-21bb0e34bb95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.499009 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.498990 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c120fa-2092-4be2-b6ea-21bb0e34bb95-tmp\") pod \"klusterlet-addon-workmgr-5c8d96db98-nj6lg\" (UID: \"f2c120fa-2092-4be2-b6ea-21bb0e34bb95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.502339 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.502314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f2c120fa-2092-4be2-b6ea-21bb0e34bb95-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c8d96db98-nj6lg\" (UID: \"f2c120fa-2092-4be2-b6ea-21bb0e34bb95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.510123 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.510101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hb2b\" (UniqueName: \"kubernetes.io/projected/f2c120fa-2092-4be2-b6ea-21bb0e34bb95-kube-api-access-8hb2b\") pod \"klusterlet-addon-workmgr-5c8d96db98-nj6lg\" (UID: \"f2c120fa-2092-4be2-b6ea-21bb0e34bb95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.622776 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.622685 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:14.755241 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:14.755212 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"]
Apr 28 19:17:14.769381 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:17:14.769351 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c120fa_2092_4be2_b6ea_21bb0e34bb95.slice/crio-fabdc9c8d87c99b1a62e5b26949a89b8d242eca719c7ada71fe7b2b1f7505e47 WatchSource:0}: Error finding container fabdc9c8d87c99b1a62e5b26949a89b8d242eca719c7ada71fe7b2b1f7505e47: Status 404 returned error can't find the container with id fabdc9c8d87c99b1a62e5b26949a89b8d242eca719c7ada71fe7b2b1f7505e47
Apr 28 19:17:15.723080 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:15.723039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg" event={"ID":"f2c120fa-2092-4be2-b6ea-21bb0e34bb95","Type":"ContainerStarted","Data":"fabdc9c8d87c99b1a62e5b26949a89b8d242eca719c7ada71fe7b2b1f7505e47"}
Apr 28 19:17:20.734221 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:20.734146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg" event={"ID":"f2c120fa-2092-4be2-b6ea-21bb0e34bb95","Type":"ContainerStarted","Data":"6f137cf984aabb5ba3ebf3c402f683d4fed54bb4650d5db4bf45a040ce5a4aca"}
Apr 28 19:17:20.734570 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:20.734348 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:20.735819 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:20.735799 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg"
Apr 28 19:17:20.753676 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:20.753634 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg" podStartSLOduration=1.065223618 podStartE2EDuration="6.753622159s" podCreationTimestamp="2026-04-28 19:17:14 +0000 UTC" firstStartedPulling="2026-04-28 19:17:14.771291692 +0000 UTC m=+50.734192036" lastFinishedPulling="2026-04-28 19:17:20.459690233 +0000 UTC m=+56.422590577" observedRunningTime="2026-04-28 19:17:20.75289375 +0000 UTC m=+56.715794120" watchObservedRunningTime="2026-04-28 19:17:20.753622159 +0000 UTC m=+56.716522517"
Apr 28 19:17:22.677672 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:22.677647 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5tdk8"
Apr 28 19:17:29.203919 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:29.203879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:17:29.204332 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:29.203941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:17:29.204332 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:29.204036 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:29.204332 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:29.204043 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:29.204332 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:29.204101 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls podName:018d803a-f231-469e-8539-32dcc07e43f8 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:01.204086498 +0000 UTC m=+97.166986842 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls") pod "dns-default-qjvwn" (UID: "018d803a-f231-469e-8539-32dcc07e43f8") : secret "dns-default-metrics-tls" not found
Apr 28 19:17:29.204332 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:29.204116 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert podName:eb3c06b0-d193-4c98-afaf-e689e3a82af8 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:01.204109364 +0000 UTC m=+97.167009709 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert") pod "ingress-canary-j8mfv" (UID: "eb3c06b0-d193-4c98-afaf-e689e3a82af8") : secret "canary-serving-cert" not found
Apr 28 19:17:30.210308 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:30.210271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9"
Apr 28 19:17:30.210919 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:30.210449 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:17:30.210919 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:17:30.210544 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs podName:4236a3f6-5c96-4e29-bb77-8dafe3cd242d nodeName:}" failed. No retries permitted until 2026-04-28 19:18:34.210522806 +0000 UTC m=+130.173423171 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs") pod "network-metrics-daemon-8j8w9" (UID: "4236a3f6-5c96-4e29-bb77-8dafe3cd242d") : secret "metrics-daemon-secret" not found
Apr 28 19:17:33.700479 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:17:33.700443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hgdfg"
Apr 28 19:18:01.229227 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:01.229090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:18:01.229227 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:01.229185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:18:01.229775 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:01.229251 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:18:01.229775 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:01.229283 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:18:01.229775 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:01.229337 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls podName:018d803a-f231-469e-8539-32dcc07e43f8 nodeName:}" failed.
No retries permitted until 2026-04-28 19:19:05.229318552 +0000 UTC m=+161.192218900 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls") pod "dns-default-qjvwn" (UID: "018d803a-f231-469e-8539-32dcc07e43f8") : secret "dns-default-metrics-tls" not found Apr 28 19:18:01.229775 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:01.229352 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert podName:eb3c06b0-d193-4c98-afaf-e689e3a82af8 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:05.229344701 +0000 UTC m=+161.192245049 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert") pod "ingress-canary-j8mfv" (UID: "eb3c06b0-d193-4c98-afaf-e689e3a82af8") : secret "canary-serving-cert" not found Apr 28 19:18:34.262431 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:34.262389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:18:34.263016 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:34.262506 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 28 19:18:34.263016 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:34.262582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs podName:4236a3f6-5c96-4e29-bb77-8dafe3cd242d nodeName:}" failed. 
No retries permitted until 2026-04-28 19:20:36.262566808 +0000 UTC m=+252.225467152 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs") pod "network-metrics-daemon-8j8w9" (UID: "4236a3f6-5c96-4e29-bb77-8dafe3cd242d") : secret "metrics-daemon-secret" not found Apr 28 19:18:49.491752 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.491717 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9"] Apr 28 19:18:49.494408 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.494390 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:49.500416 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.500389 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 28 19:18:49.500562 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.500453 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-5zq5f\"" Apr 28 19:18:49.501575 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.501557 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 28 19:18:49.501696 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.501558 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 28 19:18:49.501696 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.501627 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 28 19:18:49.508389 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.508368 
2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9"] Apr 28 19:18:49.599532 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.599495 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fts6r"] Apr 28 19:18:49.602423 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.602392 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.606873 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.606849 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 28 19:18:49.609357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.609327 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 28 19:18:49.614400 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.614375 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:18:49.614400 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.614391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-sh259\"" Apr 28 19:18:49.615380 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.615365 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 28 19:18:49.615951 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.615921 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 28 19:18:49.621718 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.621694 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-9d4b6777b-fts6r"] Apr 28 19:18:49.668558 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.668522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6f21f574-c1af-4f40-9435-416276a65b15-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:49.668752 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.668631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvbsz\" (UniqueName: \"kubernetes.io/projected/6f21f574-c1af-4f40-9435-416276a65b15-kube-api-access-xvbsz\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:49.668752 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.668657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:49.705662 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.705627 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xq7sp"] Apr 28 19:18:49.708333 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.708318 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.711795 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.711760 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 28 19:18:49.711995 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.711800 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 28 19:18:49.711995 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.711920 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 28 19:18:49.712200 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.712090 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 28 19:18:49.712343 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.712196 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-wx6bw\"" Apr 28 19:18:49.717401 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.717378 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 28 19:18:49.718643 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.718626 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-64f9844bff-crx4m"] Apr 28 19:18:49.724351 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.724331 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xq7sp"] Apr 28 19:18:49.724445 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.724427 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.727526 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.727503 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 28 19:18:49.727651 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.727506 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 28 19:18:49.727651 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.727570 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6x2jf\"" Apr 28 19:18:49.727864 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.727839 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 28 19:18:49.734255 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.734235 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 28 19:18:49.740837 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.740811 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64f9844bff-crx4m"] Apr 28 19:18:49.769179 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.769089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6551d5d0-2583-4478-98ab-1efc22016165-serving-cert\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.769179 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.769157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xvbsz\" (UniqueName: \"kubernetes.io/projected/6f21f574-c1af-4f40-9435-416276a65b15-kube-api-access-xvbsz\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:49.769179 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.769179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:49.769442 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.769198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6551d5d0-2583-4478-98ab-1efc22016165-trusted-ca\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.769442 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.769214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf5s2\" (UniqueName: \"kubernetes.io/projected/6551d5d0-2583-4478-98ab-1efc22016165-kube-api-access-pf5s2\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.769442 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.769244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6f21f574-c1af-4f40-9435-416276a65b15-telemetry-config\") pod 
\"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:49.769442 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.769262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6551d5d0-2583-4478-98ab-1efc22016165-config\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.769442 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:49.769333 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:49.769442 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:49.769402 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls podName:6f21f574-c1af-4f40-9435-416276a65b15 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:50.269386098 +0000 UTC m=+146.232286442 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-smtv9" (UID: "6f21f574-c1af-4f40-9435-416276a65b15") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:49.769927 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.769908 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6f21f574-c1af-4f40-9435-416276a65b15-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:49.779870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.779838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvbsz\" (UniqueName: \"kubernetes.io/projected/6f21f574-c1af-4f40-9435-416276a65b15-kube-api-access-xvbsz\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:49.869953 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.869913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-installation-pull-secrets\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.869953 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.869953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-bound-sa-token\") 
pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.870151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.869982 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cc2665-0cf0-4e9a-9316-292edb21e2bc-service-ca-bundle\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.870151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-certificates\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.870151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-trusted-ca\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.870151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pf5s2\" (UniqueName: \"kubernetes.io/projected/6551d5d0-2583-4478-98ab-1efc22016165-kube-api-access-pf5s2\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.870151 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:18:49.870130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cc2665-0cf0-4e9a-9316-292edb21e2bc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.870151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870151 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/65cc2665-0cf0-4e9a-9316-292edb21e2bc-snapshots\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.870331 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870172 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.870331 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870231 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65cc2665-0cf0-4e9a-9316-292edb21e2bc-serving-cert\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.870331 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870275 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjs5x\" (UniqueName: 
\"kubernetes.io/projected/65cc2665-0cf0-4e9a-9316-292edb21e2bc-kube-api-access-bjs5x\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.870331 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trwnv\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-kube-api-access-trwnv\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.870464 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6551d5d0-2583-4478-98ab-1efc22016165-trusted-ca\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.870464 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6551d5d0-2583-4478-98ab-1efc22016165-config\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.870464 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6551d5d0-2583-4478-98ab-1efc22016165-serving-cert\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.870464 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/65cc2665-0cf0-4e9a-9316-292edb21e2bc-tmp\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.870464 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-image-registry-private-configuration\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.870653 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.870459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-ca-trust-extracted\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.871035 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.871007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6551d5d0-2583-4478-98ab-1efc22016165-config\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.871146 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.871115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6551d5d0-2583-4478-98ab-1efc22016165-trusted-ca\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.872889 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.872873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6551d5d0-2583-4478-98ab-1efc22016165-serving-cert\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.879878 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.879855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf5s2\" (UniqueName: \"kubernetes.io/projected/6551d5d0-2583-4478-98ab-1efc22016165-kube-api-access-pf5s2\") pod \"console-operator-9d4b6777b-fts6r\" (UID: \"6551d5d0-2583-4478-98ab-1efc22016165\") " pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.915383 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.915348 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" Apr 28 19:18:49.971311 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/65cc2665-0cf0-4e9a-9316-292edb21e2bc-tmp\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.971311 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-image-registry-private-configuration\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.971523 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-ca-trust-extracted\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.971523 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-installation-pull-secrets\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.971523 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971402 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-bound-sa-token\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.971523 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cc2665-0cf0-4e9a-9316-292edb21e2bc-service-ca-bundle\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.971523 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-certificates\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.971523 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-trusted-ca\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.971523 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cc2665-0cf0-4e9a-9316-292edb21e2bc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " 
pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.971897 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/65cc2665-0cf0-4e9a-9316-292edb21e2bc-snapshots\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.971897 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.971897 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65cc2665-0cf0-4e9a-9316-292edb21e2bc-serving-cert\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.971897 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjs5x\" (UniqueName: \"kubernetes.io/projected/65cc2665-0cf0-4e9a-9316-292edb21e2bc-kube-api-access-bjs5x\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.971897 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/65cc2665-0cf0-4e9a-9316-292edb21e2bc-tmp\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.971897 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.971749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trwnv\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-kube-api-access-trwnv\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.972180 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:49.972153 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:49.972180 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:49.972170 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f9844bff-crx4m: secret "image-registry-tls" not found Apr 28 19:18:49.972275 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.972198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cc2665-0cf0-4e9a-9316-292edb21e2bc-service-ca-bundle\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.972275 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:49.972228 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls podName:59cffdfd-1e99-4ed5-96e7-7b40825f9cbe nodeName:}" failed. No retries permitted until 2026-04-28 19:18:50.472211906 +0000 UTC m=+146.435112251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls") pod "image-registry-64f9844bff-crx4m" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe") : secret "image-registry-tls" not found Apr 28 19:18:49.972381 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.972318 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/65cc2665-0cf0-4e9a-9316-292edb21e2bc-snapshots\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.973004 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.972889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-certificates\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.973533 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.973504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-ca-trust-extracted\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.975203 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.975110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-trusted-ca\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.975981 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:18:49.975557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65cc2665-0cf0-4e9a-9316-292edb21e2bc-serving-cert\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.975981 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.975632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-installation-pull-secrets\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.976295 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.976251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-image-registry-private-configuration\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.976544 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.976519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cc2665-0cf0-4e9a-9316-292edb21e2bc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:49.982175 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.982124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trwnv\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-kube-api-access-trwnv\") pod 
\"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.983771 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.983749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-bound-sa-token\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:49.983860 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:49.983813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjs5x\" (UniqueName: \"kubernetes.io/projected/65cc2665-0cf0-4e9a-9316-292edb21e2bc-kube-api-access-bjs5x\") pod \"insights-operator-585dfdc468-xq7sp\" (UID: \"65cc2665-0cf0-4e9a-9316-292edb21e2bc\") " pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:50.017911 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:50.017879 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xq7sp" Apr 28 19:18:50.049273 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:50.049243 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fts6r"] Apr 28 19:18:50.052017 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:18:50.051971 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6551d5d0_2583_4478_98ab_1efc22016165.slice/crio-e7258deb5fc5dc68b2962d8cff4e309593d2de15d102668e4676d7ba88d5f4e0 WatchSource:0}: Error finding container e7258deb5fc5dc68b2962d8cff4e309593d2de15d102668e4676d7ba88d5f4e0: Status 404 returned error can't find the container with id e7258deb5fc5dc68b2962d8cff4e309593d2de15d102668e4676d7ba88d5f4e0 Apr 28 19:18:50.142320 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:50.142281 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xq7sp"] Apr 28 19:18:50.146005 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:18:50.145976 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65cc2665_0cf0_4e9a_9316_292edb21e2bc.slice/crio-31e4af9e2f0db9cd1a2e080c145db0393595b8cac160ec8100c68c5e6b4c7c6c WatchSource:0}: Error finding container 31e4af9e2f0db9cd1a2e080c145db0393595b8cac160ec8100c68c5e6b4c7c6c: Status 404 returned error can't find the container with id 31e4af9e2f0db9cd1a2e080c145db0393595b8cac160ec8100c68c5e6b4c7c6c Apr 28 19:18:50.274361 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:50.274318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: 
\"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:50.274533 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:50.274434 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:50.274533 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:50.274492 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls podName:6f21f574-c1af-4f40-9435-416276a65b15 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:51.274477102 +0000 UTC m=+147.237377446 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-smtv9" (UID: "6f21f574-c1af-4f40-9435-416276a65b15") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:50.475703 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:50.475591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:50.475857 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:50.475744 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:50.475857 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:50.475764 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f9844bff-crx4m: secret "image-registry-tls" not found Apr 28 
19:18:50.475857 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:50.475821 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls podName:59cffdfd-1e99-4ed5-96e7-7b40825f9cbe nodeName:}" failed. No retries permitted until 2026-04-28 19:18:51.475805965 +0000 UTC m=+147.438706309 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls") pod "image-registry-64f9844bff-crx4m" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe") : secret "image-registry-tls" not found Apr 28 19:18:50.910263 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:50.910207 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" event={"ID":"6551d5d0-2583-4478-98ab-1efc22016165","Type":"ContainerStarted","Data":"e7258deb5fc5dc68b2962d8cff4e309593d2de15d102668e4676d7ba88d5f4e0"} Apr 28 19:18:50.911407 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:50.911375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xq7sp" event={"ID":"65cc2665-0cf0-4e9a-9316-292edb21e2bc","Type":"ContainerStarted","Data":"31e4af9e2f0db9cd1a2e080c145db0393595b8cac160ec8100c68c5e6b4c7c6c"} Apr 28 19:18:51.282899 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:51.282844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:51.283112 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:51.283087 2576 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:51.283202 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:51.283181 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls podName:6f21f574-c1af-4f40-9435-416276a65b15 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:53.283158408 +0000 UTC m=+149.246058762 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-smtv9" (UID: "6f21f574-c1af-4f40-9435-416276a65b15") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:51.484993 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:51.484953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:51.485155 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:51.485132 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:51.485221 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:51.485158 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f9844bff-crx4m: secret "image-registry-tls" not found Apr 28 19:18:51.485221 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:51.485217 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls 
podName:59cffdfd-1e99-4ed5-96e7-7b40825f9cbe nodeName:}" failed. No retries permitted until 2026-04-28 19:18:53.4851988 +0000 UTC m=+149.448099144 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls") pod "image-registry-64f9844bff-crx4m" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe") : secret "image-registry-tls" not found Apr 28 19:18:52.916639 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:52.916591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xq7sp" event={"ID":"65cc2665-0cf0-4e9a-9316-292edb21e2bc","Type":"ContainerStarted","Data":"8c830bcdf1cb4bd2a3e6438c9f9e83698d2171ab02e460ebf4e086340c496547"} Apr 28 19:18:52.918087 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:52.918063 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/0.log" Apr 28 19:18:52.918193 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:52.918109 2576 generic.go:358] "Generic (PLEG): container finished" podID="6551d5d0-2583-4478-98ab-1efc22016165" containerID="c61b7bd183edafd8f3c95a119a8d2346bc39977c8ac82135aa83e6b9c51ce706" exitCode=255 Apr 28 19:18:52.918193 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:52.918140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" event={"ID":"6551d5d0-2583-4478-98ab-1efc22016165","Type":"ContainerDied","Data":"c61b7bd183edafd8f3c95a119a8d2346bc39977c8ac82135aa83e6b9c51ce706"} Apr 28 19:18:52.918362 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:52.918349 2576 scope.go:117] "RemoveContainer" containerID="c61b7bd183edafd8f3c95a119a8d2346bc39977c8ac82135aa83e6b9c51ce706" Apr 28 19:18:52.941632 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:52.941570 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-xq7sp" podStartSLOduration=1.746635685 podStartE2EDuration="3.941555421s" podCreationTimestamp="2026-04-28 19:18:49 +0000 UTC" firstStartedPulling="2026-04-28 19:18:50.147772128 +0000 UTC m=+146.110672472" lastFinishedPulling="2026-04-28 19:18:52.342691861 +0000 UTC m=+148.305592208" observedRunningTime="2026-04-28 19:18:52.93957277 +0000 UTC m=+148.902473139" watchObservedRunningTime="2026-04-28 19:18:52.941555421 +0000 UTC m=+148.904455787" Apr 28 19:18:53.298865 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:53.298822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:18:53.299027 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:53.298990 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:53.299085 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:53.299074 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls podName:6f21f574-c1af-4f40-9435-416276a65b15 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:57.299053436 +0000 UTC m=+153.261953783 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-smtv9" (UID: "6f21f574-c1af-4f40-9435-416276a65b15") : secret "cluster-monitoring-operator-tls" not found Apr 28 19:18:53.500351 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:53.500307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m" Apr 28 19:18:53.500520 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:53.500439 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:53.500520 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:53.500453 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f9844bff-crx4m: secret "image-registry-tls" not found Apr 28 19:18:53.500520 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:53.500504 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls podName:59cffdfd-1e99-4ed5-96e7-7b40825f9cbe nodeName:}" failed. No retries permitted until 2026-04-28 19:18:57.500490838 +0000 UTC m=+153.463391182 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls") pod "image-registry-64f9844bff-crx4m" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe") : secret "image-registry-tls" not found Apr 28 19:18:53.922258 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:53.922232 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:18:53.922680 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:53.922578 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/0.log" Apr 28 19:18:53.922680 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:53.922627 2576 generic.go:358] "Generic (PLEG): container finished" podID="6551d5d0-2583-4478-98ab-1efc22016165" containerID="9cd10fad5281b41863729d553269312eebad898193b2306af53619de147e9d57" exitCode=255 Apr 28 19:18:53.922767 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:53.922706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" event={"ID":"6551d5d0-2583-4478-98ab-1efc22016165","Type":"ContainerDied","Data":"9cd10fad5281b41863729d553269312eebad898193b2306af53619de147e9d57"} Apr 28 19:18:53.922767 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:53.922756 2576 scope.go:117] "RemoveContainer" containerID="c61b7bd183edafd8f3c95a119a8d2346bc39977c8ac82135aa83e6b9c51ce706" Apr 28 19:18:53.923067 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:53.923042 2576 scope.go:117] "RemoveContainer" containerID="9cd10fad5281b41863729d553269312eebad898193b2306af53619de147e9d57" Apr 28 19:18:53.923242 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:53.923222 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fts6r_openshift-console-operator(6551d5d0-2583-4478-98ab-1efc22016165)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" podUID="6551d5d0-2583-4478-98ab-1efc22016165" Apr 28 19:18:54.678137 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.678100 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl"] Apr 28 19:18:54.681161 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.681146 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl" Apr 28 19:18:54.684330 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.684306 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-t8fh7\"" Apr 28 19:18:54.685529 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.685507 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 28 19:18:54.685710 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.685507 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 28 19:18:54.692936 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.692914 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl"] Apr 28 19:18:54.710445 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.710418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrnms\" (UniqueName: 
\"kubernetes.io/projected/2ceb6eda-ba8b-4e64-86be-238acc7be78a-kube-api-access-jrnms\") pod \"migrator-74bb7799d9-dq4vl\" (UID: \"2ceb6eda-ba8b-4e64-86be-238acc7be78a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl" Apr 28 19:18:54.811086 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.811050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrnms\" (UniqueName: \"kubernetes.io/projected/2ceb6eda-ba8b-4e64-86be-238acc7be78a-kube-api-access-jrnms\") pod \"migrator-74bb7799d9-dq4vl\" (UID: \"2ceb6eda-ba8b-4e64-86be-238acc7be78a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl" Apr 28 19:18:54.821078 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.821044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrnms\" (UniqueName: \"kubernetes.io/projected/2ceb6eda-ba8b-4e64-86be-238acc7be78a-kube-api-access-jrnms\") pod \"migrator-74bb7799d9-dq4vl\" (UID: \"2ceb6eda-ba8b-4e64-86be-238acc7be78a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl" Apr 28 19:18:54.926419 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.926393 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:18:54.926804 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.926766 2576 scope.go:117] "RemoveContainer" containerID="9cd10fad5281b41863729d553269312eebad898193b2306af53619de147e9d57" Apr 28 19:18:54.926944 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:54.926927 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fts6r_openshift-console-operator(6551d5d0-2583-4478-98ab-1efc22016165)\"" 
pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" podUID="6551d5d0-2583-4478-98ab-1efc22016165"
Apr 28 19:18:54.989552 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:54.989516 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl"
Apr 28 19:18:55.122425 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:55.122398 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl"]
Apr 28 19:18:55.125420 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:18:55.125394 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ceb6eda_ba8b_4e64_86be_238acc7be78a.slice/crio-610340aad4882f3f93c9ab031c21aefe3b983bbb8a6cb0846e1a441895cfa673 WatchSource:0}: Error finding container 610340aad4882f3f93c9ab031c21aefe3b983bbb8a6cb0846e1a441895cfa673: Status 404 returned error can't find the container with id 610340aad4882f3f93c9ab031c21aefe3b983bbb8a6cb0846e1a441895cfa673
Apr 28 19:18:55.929844 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:55.929807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl" event={"ID":"2ceb6eda-ba8b-4e64-86be-238acc7be78a","Type":"ContainerStarted","Data":"610340aad4882f3f93c9ab031c21aefe3b983bbb8a6cb0846e1a441895cfa673"}
Apr 28 19:18:56.619886 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:56.619810 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rb8nk_edd42d80-2884-4124-a4b2-2aea5543b72b/dns-node-resolver/0.log"
Apr 28 19:18:56.934203 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:56.934107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl" event={"ID":"2ceb6eda-ba8b-4e64-86be-238acc7be78a","Type":"ContainerStarted","Data":"65fba2fdb8c4929bd504e70755783f7ea0bdf30e629f3035701ae4da7d1cffa0"}
Apr 28 19:18:56.934203 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:56.934150 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl" event={"ID":"2ceb6eda-ba8b-4e64-86be-238acc7be78a","Type":"ContainerStarted","Data":"b23baa24053f48cea77dd8f477a30b959c2f587f01fc82cca3e4d23764a354e6"}
Apr 28 19:18:56.952410 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:56.952360 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dq4vl" podStartSLOduration=1.822033848 podStartE2EDuration="2.952343165s" podCreationTimestamp="2026-04-28 19:18:54 +0000 UTC" firstStartedPulling="2026-04-28 19:18:55.127548823 +0000 UTC m=+151.090449183" lastFinishedPulling="2026-04-28 19:18:56.257858155 +0000 UTC m=+152.220758500" observedRunningTime="2026-04-28 19:18:56.951722312 +0000 UTC m=+152.914622678" watchObservedRunningTime="2026-04-28 19:18:56.952343165 +0000 UTC m=+152.915243536"
Apr 28 19:18:57.329871 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:57.329825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9"
Apr 28 19:18:57.330066 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:57.329979 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:18:57.330066 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:57.330048 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls podName:6f21f574-c1af-4f40-9435-416276a65b15 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:05.330033069 +0000 UTC m=+161.292933413 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-smtv9" (UID: "6f21f574-c1af-4f40-9435-416276a65b15") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:18:57.531691 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:57.531642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m"
Apr 28 19:18:57.531873 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:57.531780 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 28 19:18:57.531873 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:57.531801 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64f9844bff-crx4m: secret "image-registry-tls" not found
Apr 28 19:18:57.531873 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:57.531860 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls podName:59cffdfd-1e99-4ed5-96e7-7b40825f9cbe nodeName:}" failed. No retries permitted until 2026-04-28 19:19:05.53184368 +0000 UTC m=+161.494744024 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls") pod "image-registry-64f9844bff-crx4m" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe") : secret "image-registry-tls" not found
Apr 28 19:18:57.621835 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:57.621755 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xfk6k_b5119881-7aaa-4ea1-8738-f8463adc7b0c/node-ca/0.log"
Apr 28 19:18:59.916457 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:59.916410 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r"
Apr 28 19:18:59.916457 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:59.916469 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r"
Apr 28 19:18:59.916907 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:18:59.916857 2576 scope.go:117] "RemoveContainer" containerID="9cd10fad5281b41863729d553269312eebad898193b2306af53619de147e9d57"
Apr 28 19:18:59.917076 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:18:59.917055 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fts6r_openshift-console-operator(6551d5d0-2583-4478-98ab-1efc22016165)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" podUID="6551d5d0-2583-4478-98ab-1efc22016165"
Apr 28 19:19:00.407367 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:19:00.407315 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qjvwn" podUID="018d803a-f231-469e-8539-32dcc07e43f8"
Apr 28 19:19:00.413413 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:19:00.413386 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-j8mfv" podUID="eb3c06b0-d193-4c98-afaf-e689e3a82af8"
Apr 28 19:19:00.520316 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:19:00.520271 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-8j8w9" podUID="4236a3f6-5c96-4e29-bb77-8dafe3cd242d"
Apr 28 19:19:00.944353 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:00.944317 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:19:00.944763 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:00.944317 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:19:05.294754 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.294714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:19:05.295253 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.294823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:19:05.297336 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.297304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/018d803a-f231-469e-8539-32dcc07e43f8-metrics-tls\") pod \"dns-default-qjvwn\" (UID: \"018d803a-f231-469e-8539-32dcc07e43f8\") " pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:19:05.297336 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.297332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb3c06b0-d193-4c98-afaf-e689e3a82af8-cert\") pod \"ingress-canary-j8mfv\" (UID: \"eb3c06b0-d193-4c98-afaf-e689e3a82af8\") " pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:19:05.395579 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.395526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9"
Apr 28 19:19:05.395776 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:19:05.395692 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 28 19:19:05.395776 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:19:05.395759 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls podName:6f21f574-c1af-4f40-9435-416276a65b15 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:21.39574313 +0000 UTC m=+177.358643473 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-smtv9" (UID: "6f21f574-c1af-4f40-9435-416276a65b15") : secret "cluster-monitoring-operator-tls" not found
Apr 28 19:19:05.449578 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.449548 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pzmft\""
Apr 28 19:19:05.449578 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.449549 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6qdcw\""
Apr 28 19:19:05.455997 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.455973 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-j8mfv"
Apr 28 19:19:05.456066 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.455998 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:19:05.586476 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.586405 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-j8mfv"]
Apr 28 19:19:05.589116 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:19:05.589083 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb3c06b0_d193_4c98_afaf_e689e3a82af8.slice/crio-fa75236c8365b9a5f75e60336617a682f84d76e1f488e1d5b55a2b5d8fc046bb WatchSource:0}: Error finding container fa75236c8365b9a5f75e60336617a682f84d76e1f488e1d5b55a2b5d8fc046bb: Status 404 returned error can't find the container with id fa75236c8365b9a5f75e60336617a682f84d76e1f488e1d5b55a2b5d8fc046bb
Apr 28 19:19:05.596452 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.596429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m"
Apr 28 19:19:05.598935 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.598905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls\") pod \"image-registry-64f9844bff-crx4m\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") " pod="openshift-image-registry/image-registry-64f9844bff-crx4m"
Apr 28 19:19:05.605183 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.605158 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qjvwn"]
Apr 28 19:19:05.608228 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:19:05.608201 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod018d803a_f231_469e_8539_32dcc07e43f8.slice/crio-ddf7aee8cfe8b820cb95341edc70a6a1910ba139baffcfc1166ecf0740a4131d WatchSource:0}: Error finding container ddf7aee8cfe8b820cb95341edc70a6a1910ba139baffcfc1166ecf0740a4131d: Status 404 returned error can't find the container with id ddf7aee8cfe8b820cb95341edc70a6a1910ba139baffcfc1166ecf0740a4131d
Apr 28 19:19:05.634736 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.634702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64f9844bff-crx4m"
Apr 28 19:19:05.754043 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.754014 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64f9844bff-crx4m"]
Apr 28 19:19:05.756513 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:19:05.756485 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59cffdfd_1e99_4ed5_96e7_7b40825f9cbe.slice/crio-50bbdf1eca16be5f65e476bf15651f3b7eb4cec87745d9626fa129dd0bc504f6 WatchSource:0}: Error finding container 50bbdf1eca16be5f65e476bf15651f3b7eb4cec87745d9626fa129dd0bc504f6: Status 404 returned error can't find the container with id 50bbdf1eca16be5f65e476bf15651f3b7eb4cec87745d9626fa129dd0bc504f6
Apr 28 19:19:05.958146 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.958058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qjvwn" event={"ID":"018d803a-f231-469e-8539-32dcc07e43f8","Type":"ContainerStarted","Data":"ddf7aee8cfe8b820cb95341edc70a6a1910ba139baffcfc1166ecf0740a4131d"}
Apr 28 19:19:05.959394 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.959355 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64f9844bff-crx4m" event={"ID":"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe","Type":"ContainerStarted","Data":"71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e"}
Apr 28 19:19:05.959394 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.959396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64f9844bff-crx4m" event={"ID":"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe","Type":"ContainerStarted","Data":"50bbdf1eca16be5f65e476bf15651f3b7eb4cec87745d9626fa129dd0bc504f6"}
Apr 28 19:19:05.959623 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.959432 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-64f9844bff-crx4m"
Apr 28 19:19:05.960480 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.960457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-j8mfv" event={"ID":"eb3c06b0-d193-4c98-afaf-e689e3a82af8","Type":"ContainerStarted","Data":"fa75236c8365b9a5f75e60336617a682f84d76e1f488e1d5b55a2b5d8fc046bb"}
Apr 28 19:19:05.983981 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:05.983934 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-64f9844bff-crx4m" podStartSLOduration=16.983920833 podStartE2EDuration="16.983920833s" podCreationTimestamp="2026-04-28 19:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:19:05.983630111 +0000 UTC m=+161.946530478" watchObservedRunningTime="2026-04-28 19:19:05.983920833 +0000 UTC m=+161.946821195"
Apr 28 19:19:07.970580 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:07.970527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qjvwn" event={"ID":"018d803a-f231-469e-8539-32dcc07e43f8","Type":"ContainerStarted","Data":"952fa4c04050a84b962aedf5174c1574b1e6e21f609f733a7950c1c0db44ff29"}
Apr 28 19:19:07.970580 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:07.970585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qjvwn" event={"ID":"018d803a-f231-469e-8539-32dcc07e43f8","Type":"ContainerStarted","Data":"9a479c88b4dde15a655ac09ea76584b4d42d978d2e2684c587bd42c7be5501f9"}
Apr 28 19:19:07.971088 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:07.970925 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qjvwn"
Apr 28 19:19:07.972140 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:07.972117 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-j8mfv" event={"ID":"eb3c06b0-d193-4c98-afaf-e689e3a82af8","Type":"ContainerStarted","Data":"e0866898b5c2e4c9aebd3072c96541633c750a7043f43a600cce17b14c42dc11"}
Apr 28 19:19:07.993926 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:07.993865 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qjvwn" podStartSLOduration=129.011218233 podStartE2EDuration="2m10.993850601s" podCreationTimestamp="2026-04-28 19:16:57 +0000 UTC" firstStartedPulling="2026-04-28 19:19:05.610062822 +0000 UTC m=+161.572963166" lastFinishedPulling="2026-04-28 19:19:07.592695188 +0000 UTC m=+163.555595534" observedRunningTime="2026-04-28 19:19:07.991565217 +0000 UTC m=+163.954465583" watchObservedRunningTime="2026-04-28 19:19:07.993850601 +0000 UTC m=+163.956750967"
Apr 28 19:19:08.017594 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:08.017539 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-j8mfv" podStartSLOduration=129.01242669 podStartE2EDuration="2m11.01752253s" podCreationTimestamp="2026-04-28 19:16:57 +0000 UTC" firstStartedPulling="2026-04-28 19:19:05.590936754 +0000 UTC m=+161.553837098" lastFinishedPulling="2026-04-28 19:19:07.59603259 +0000 UTC m=+163.558932938" observedRunningTime="2026-04-28 19:19:08.016990532 +0000 UTC m=+163.979890897" watchObservedRunningTime="2026-04-28 19:19:08.01752253 +0000 UTC m=+163.980422896"
Apr 28 19:19:12.510033 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:12.509950 2576 scope.go:117] "RemoveContainer" containerID="9cd10fad5281b41863729d553269312eebad898193b2306af53619de147e9d57"
Apr 28 19:19:12.987474 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:12.987449 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log"
Apr 28 19:19:12.987639 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:12.987509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" event={"ID":"6551d5d0-2583-4478-98ab-1efc22016165","Type":"ContainerStarted","Data":"4b292ab13a930b12eab8c5d8791b36516bff98ab7a33c59ea68fc2132c5d7e92"}
Apr 28 19:19:12.987793 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:12.987774 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r"
Apr 28 19:19:13.007732 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:13.007689 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r" podStartSLOduration=21.721387557 podStartE2EDuration="24.007678627s" podCreationTimestamp="2026-04-28 19:18:49 +0000 UTC" firstStartedPulling="2026-04-28 19:18:50.053718901 +0000 UTC m=+146.016619245" lastFinishedPulling="2026-04-28 19:18:52.340009969 +0000 UTC m=+148.302910315" observedRunningTime="2026-04-28 19:19:13.007476005 +0000 UTC m=+168.970376372" watchObservedRunningTime="2026-04-28 19:19:13.007678627 +0000 UTC m=+168.970578992"
Apr 28 19:19:13.757135 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:13.757104 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-fts6r"
Apr 28 19:19:14.510396 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:14.510366 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9"
Apr 28 19:19:17.589012 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.588980 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-hgljc"]
Apr 28 19:19:17.593693 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.593670 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hgljc"
Apr 28 19:19:17.598412 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.598387 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4gwhj\""
Apr 28 19:19:17.598593 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.598572 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 28 19:19:17.599698 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.599680 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 28 19:19:17.609105 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.609085 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-b7hh9"]
Apr 28 19:19:17.612039 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.612022 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.614501 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.614481 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hgljc"]
Apr 28 19:19:17.614916 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.614895 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-v5snm\""
Apr 28 19:19:17.615238 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.615221 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 28 19:19:17.615362 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.615345 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 28 19:19:17.634990 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.634960 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b7hh9"]
Apr 28 19:19:17.655935 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.655904 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64f9844bff-crx4m"]
Apr 28 19:19:17.696069 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.696037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/448e32b7-e5cf-4771-a258-5be2467dc339-data-volume\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.696069 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.696071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w2nf\" (UniqueName: \"kubernetes.io/projected/f38d2ea1-629f-4e29-8ffd-33cc8928f1b9-kube-api-access-4w2nf\") pod \"downloads-6bcc868b7-hgljc\" (UID: \"f38d2ea1-629f-4e29-8ffd-33cc8928f1b9\") " pod="openshift-console/downloads-6bcc868b7-hgljc"
Apr 28 19:19:17.696264 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.696091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/448e32b7-e5cf-4771-a258-5be2467dc339-crio-socket\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.696264 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.696169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cm94\" (UniqueName: \"kubernetes.io/projected/448e32b7-e5cf-4771-a258-5be2467dc339-kube-api-access-2cm94\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.696264 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.696214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/448e32b7-e5cf-4771-a258-5be2467dc339-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.696264 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.696232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/448e32b7-e5cf-4771-a258-5be2467dc339-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.715547 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.715518 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b4b455f45-wj89m"]
Apr 28 19:19:17.718687 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.718666 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.740009 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.739982 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b4b455f45-wj89m"]
Apr 28 19:19:17.797117 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6083dc36-b003-4755-9d7d-93340b1b3f4e-image-registry-private-configuration\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.797238 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/448e32b7-e5cf-4771-a258-5be2467dc339-data-volume\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.797238 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4w2nf\" (UniqueName: \"kubernetes.io/projected/f38d2ea1-629f-4e29-8ffd-33cc8928f1b9-kube-api-access-4w2nf\") pod \"downloads-6bcc868b7-hgljc\" (UID: \"f38d2ea1-629f-4e29-8ffd-33cc8928f1b9\") " pod="openshift-console/downloads-6bcc868b7-hgljc"
Apr 28 19:19:17.797238 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6083dc36-b003-4755-9d7d-93340b1b3f4e-registry-tls\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.797238 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6083dc36-b003-4755-9d7d-93340b1b3f4e-bound-sa-token\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.797419 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/448e32b7-e5cf-4771-a258-5be2467dc339-crio-socket\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.797419 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/448e32b7-e5cf-4771-a258-5be2467dc339-data-volume\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.797419 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797395 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/448e32b7-e5cf-4771-a258-5be2467dc339-crio-socket\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.797537 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pm5g\" (UniqueName: \"kubernetes.io/projected/6083dc36-b003-4755-9d7d-93340b1b3f4e-kube-api-access-8pm5g\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.797537 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cm94\" (UniqueName: \"kubernetes.io/projected/448e32b7-e5cf-4771-a258-5be2467dc339-kube-api-access-2cm94\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.797537 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6083dc36-b003-4755-9d7d-93340b1b3f4e-ca-trust-extracted\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.797537 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6083dc36-b003-4755-9d7d-93340b1b3f4e-trusted-ca\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.797735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6083dc36-b003-4755-9d7d-93340b1b3f4e-installation-pull-secrets\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.797735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/448e32b7-e5cf-4771-a258-5be2467dc339-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.797735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/448e32b7-e5cf-4771-a258-5be2467dc339-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.797735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.797649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6083dc36-b003-4755-9d7d-93340b1b3f4e-registry-certificates\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.798133 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.798115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/448e32b7-e5cf-4771-a258-5be2467dc339-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.799902 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.799874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/448e32b7-e5cf-4771-a258-5be2467dc339-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.806205 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.806187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w2nf\" (UniqueName: \"kubernetes.io/projected/f38d2ea1-629f-4e29-8ffd-33cc8928f1b9-kube-api-access-4w2nf\") pod \"downloads-6bcc868b7-hgljc\" (UID: \"f38d2ea1-629f-4e29-8ffd-33cc8928f1b9\") " pod="openshift-console/downloads-6bcc868b7-hgljc"
Apr 28 19:19:17.812297 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.812264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cm94\" (UniqueName: \"kubernetes.io/projected/448e32b7-e5cf-4771-a258-5be2467dc339-kube-api-access-2cm94\") pod \"insights-runtime-extractor-b7hh9\" (UID: \"448e32b7-e5cf-4771-a258-5be2467dc339\") " pod="openshift-insights/insights-runtime-extractor-b7hh9"
Apr 28 19:19:17.899081 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.899012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6083dc36-b003-4755-9d7d-93340b1b3f4e-registry-tls\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.899081 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.899048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6083dc36-b003-4755-9d7d-93340b1b3f4e-bound-sa-token\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.899081 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.899077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pm5g\" (UniqueName: \"kubernetes.io/projected/6083dc36-b003-4755-9d7d-93340b1b3f4e-kube-api-access-8pm5g\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.899311 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.899120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6083dc36-b003-4755-9d7d-93340b1b3f4e-ca-trust-extracted\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.899311 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.899156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6083dc36-b003-4755-9d7d-93340b1b3f4e-trusted-ca\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:17.899311 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.899178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName:
\"kubernetes.io/secret/6083dc36-b003-4755-9d7d-93340b1b3f4e-installation-pull-secrets\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:17.899311 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.899211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6083dc36-b003-4755-9d7d-93340b1b3f4e-registry-certificates\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:17.899311 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.899267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6083dc36-b003-4755-9d7d-93340b1b3f4e-image-registry-private-configuration\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:17.899641 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.899620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6083dc36-b003-4755-9d7d-93340b1b3f4e-ca-trust-extracted\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:17.900112 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.900093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6083dc36-b003-4755-9d7d-93340b1b3f4e-trusted-ca\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" 
Apr 28 19:19:17.900380 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.900356 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6083dc36-b003-4755-9d7d-93340b1b3f4e-registry-certificates\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:17.901592 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.901565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6083dc36-b003-4755-9d7d-93340b1b3f4e-registry-tls\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:17.901721 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.901694 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6083dc36-b003-4755-9d7d-93340b1b3f4e-installation-pull-secrets\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:17.901849 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.901827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6083dc36-b003-4755-9d7d-93340b1b3f4e-image-registry-private-configuration\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:17.904855 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.904837 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hgljc" Apr 28 19:19:17.908683 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.908660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pm5g\" (UniqueName: \"kubernetes.io/projected/6083dc36-b003-4755-9d7d-93340b1b3f4e-kube-api-access-8pm5g\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:17.911726 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.911704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6083dc36-b003-4755-9d7d-93340b1b3f4e-bound-sa-token\") pod \"image-registry-6b4b455f45-wj89m\" (UID: \"6083dc36-b003-4755-9d7d-93340b1b3f4e\") " pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:17.921531 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.921512 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-b7hh9" Apr 28 19:19:17.991298 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:17.990669 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qjvwn" Apr 28 19:19:18.026852 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:18.026816 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:18.064854 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:18.064822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hgljc"] Apr 28 19:19:18.066684 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:19:18.066645 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38d2ea1_629f_4e29_8ffd_33cc8928f1b9.slice/crio-3d7d3ec337979a2e2502d8bac135976079c873f544d9e5bd8ae3c46112f1ef00 WatchSource:0}: Error finding container 3d7d3ec337979a2e2502d8bac135976079c873f544d9e5bd8ae3c46112f1ef00: Status 404 returned error can't find the container with id 3d7d3ec337979a2e2502d8bac135976079c873f544d9e5bd8ae3c46112f1ef00 Apr 28 19:19:18.097620 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:18.097582 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-b7hh9"] Apr 28 19:19:18.101237 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:19:18.101204 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod448e32b7_e5cf_4771_a258_5be2467dc339.slice/crio-e6072893b05355c416461cb265ca83b403110dfdcedb99e8301f9ff58fd6b043 WatchSource:0}: Error finding container e6072893b05355c416461cb265ca83b403110dfdcedb99e8301f9ff58fd6b043: Status 404 returned error can't find the container with id e6072893b05355c416461cb265ca83b403110dfdcedb99e8301f9ff58fd6b043 Apr 28 19:19:18.161103 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:18.161080 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b4b455f45-wj89m"] Apr 28 19:19:18.163877 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:19:18.163853 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6083dc36_b003_4755_9d7d_93340b1b3f4e.slice/crio-2ce1db1eaa30264be6e7289fd2d47ef99b65e3fbd86bb923a14297de98b54690 WatchSource:0}: Error finding container 2ce1db1eaa30264be6e7289fd2d47ef99b65e3fbd86bb923a14297de98b54690: Status 404 returned error can't find the container with id 2ce1db1eaa30264be6e7289fd2d47ef99b65e3fbd86bb923a14297de98b54690 Apr 28 19:19:19.020996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:19.020953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b7hh9" event={"ID":"448e32b7-e5cf-4771-a258-5be2467dc339","Type":"ContainerStarted","Data":"3a1b93aa25dccda90fb45c4a32a2e77f50b0e605a2f2c67599f9d44e98680097"} Apr 28 19:19:19.020996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:19.020997 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b7hh9" event={"ID":"448e32b7-e5cf-4771-a258-5be2467dc339","Type":"ContainerStarted","Data":"20a2532bb6beb42cb4b405d0b0073c5abdd1f751b61c38ecb8820575cc69f65e"} Apr 28 19:19:19.021422 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:19.021010 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b7hh9" event={"ID":"448e32b7-e5cf-4771-a258-5be2467dc339","Type":"ContainerStarted","Data":"e6072893b05355c416461cb265ca83b403110dfdcedb99e8301f9ff58fd6b043"} Apr 28 19:19:19.022209 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:19.022183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-hgljc" event={"ID":"f38d2ea1-629f-4e29-8ffd-33cc8928f1b9","Type":"ContainerStarted","Data":"3d7d3ec337979a2e2502d8bac135976079c873f544d9e5bd8ae3c46112f1ef00"} Apr 28 19:19:19.023475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:19.023446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" 
event={"ID":"6083dc36-b003-4755-9d7d-93340b1b3f4e","Type":"ContainerStarted","Data":"7acd4a3c25d67689fb7ce4a0592294927f7235a080777a3c5166356bb590d447"} Apr 28 19:19:19.023570 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:19.023480 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" event={"ID":"6083dc36-b003-4755-9d7d-93340b1b3f4e","Type":"ContainerStarted","Data":"2ce1db1eaa30264be6e7289fd2d47ef99b65e3fbd86bb923a14297de98b54690"} Apr 28 19:19:19.023656 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:19.023634 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" Apr 28 19:19:19.047979 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:19.047930 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b4b455f45-wj89m" podStartSLOduration=2.047914757 podStartE2EDuration="2.047914757s" podCreationTimestamp="2026-04-28 19:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:19:19.04686499 +0000 UTC m=+175.009765355" watchObservedRunningTime="2026-04-28 19:19:19.047914757 +0000 UTC m=+175.010815151" Apr 28 19:19:20.735067 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:20.735005 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg" podUID="f2c120fa-2092-4be2-b6ea-21bb0e34bb95" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.7:8000/readyz\": dial tcp 10.134.0.7:8000: connect: connection refused" Apr 28 19:19:21.031161 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:21.031127 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2c120fa-2092-4be2-b6ea-21bb0e34bb95" 
containerID="6f137cf984aabb5ba3ebf3c402f683d4fed54bb4650d5db4bf45a040ce5a4aca" exitCode=1 Apr 28 19:19:21.031357 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:21.031208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg" event={"ID":"f2c120fa-2092-4be2-b6ea-21bb0e34bb95","Type":"ContainerDied","Data":"6f137cf984aabb5ba3ebf3c402f683d4fed54bb4650d5db4bf45a040ce5a4aca"} Apr 28 19:19:21.031650 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:21.031628 2576 scope.go:117] "RemoveContainer" containerID="6f137cf984aabb5ba3ebf3c402f683d4fed54bb4650d5db4bf45a040ce5a4aca" Apr 28 19:19:21.033319 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:21.033296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-b7hh9" event={"ID":"448e32b7-e5cf-4771-a258-5be2467dc339","Type":"ContainerStarted","Data":"dce359cd9eb34357694f0390217dd9956c98f71719a12ca736134003390f5de1"} Apr 28 19:19:21.098507 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:21.098448 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-b7hh9" podStartSLOduration=1.87315457 podStartE2EDuration="4.098429088s" podCreationTimestamp="2026-04-28 19:19:17 +0000 UTC" firstStartedPulling="2026-04-28 19:19:18.163167947 +0000 UTC m=+174.126068292" lastFinishedPulling="2026-04-28 19:19:20.38844245 +0000 UTC m=+176.351342810" observedRunningTime="2026-04-28 19:19:21.098368391 +0000 UTC m=+177.061268748" watchObservedRunningTime="2026-04-28 19:19:21.098429088 +0000 UTC m=+177.061329455" Apr 28 19:19:21.442227 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:21.442128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:19:21.445112 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:21.445080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6f21f574-c1af-4f40-9435-416276a65b15-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-smtv9\" (UID: \"6f21f574-c1af-4f40-9435-416276a65b15\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:19:21.603637 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:21.603581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" Apr 28 19:19:21.758092 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:21.757855 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9"] Apr 28 19:19:21.761239 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:19:21.761191 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f21f574_c1af_4f40_9435_416276a65b15.slice/crio-2d997f941b449b93b98e021851d56853552e6a2699acabfc49fa1acbe16b2283 WatchSource:0}: Error finding container 2d997f941b449b93b98e021851d56853552e6a2699acabfc49fa1acbe16b2283: Status 404 returned error can't find the container with id 2d997f941b449b93b98e021851d56853552e6a2699acabfc49fa1acbe16b2283 Apr 28 19:19:22.037935 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.037896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" 
event={"ID":"6f21f574-c1af-4f40-9435-416276a65b15","Type":"ContainerStarted","Data":"2d997f941b449b93b98e021851d56853552e6a2699acabfc49fa1acbe16b2283"} Apr 28 19:19:22.039841 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.039809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg" event={"ID":"f2c120fa-2092-4be2-b6ea-21bb0e34bb95","Type":"ContainerStarted","Data":"0f11308e8204c3fdef5c1cac6a7b01ab1e0a1a7192c0820b619fdffbc12bdaa5"} Apr 28 19:19:22.040331 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.040256 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg" Apr 28 19:19:22.040913 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.040887 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c8d96db98-nj6lg" Apr 28 19:19:22.298133 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.298045 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5488c59d85-xkp8m"] Apr 28 19:19:22.335051 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.335018 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5488c59d85-xkp8m"] Apr 28 19:19:22.335195 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.335109 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.340151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.339934 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-c9kx6\"" Apr 28 19:19:22.340151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.339985 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 28 19:19:22.340151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.340006 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 28 19:19:22.340151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.339934 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 28 19:19:22.340517 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.340256 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 28 19:19:22.340563 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.340523 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 28 19:19:22.451893 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.451857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwvbz\" (UniqueName: \"kubernetes.io/projected/7da9f477-8a77-4f71-9500-d482754f1af1-kube-api-access-lwvbz\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.452104 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.451908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-console-config\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.452104 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.451987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-oauth-config\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.452104 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.452078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-serving-cert\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.452330 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.452137 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-service-ca\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.452330 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.452156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-oauth-serving-cert\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.552979 ip-10-0-139-184 kubenswrapper[2576]: 
I0428 19:19:22.552903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-console-config\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.552979 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.552960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-oauth-config\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.553188 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.553012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-serving-cert\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.553188 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.553061 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-service-ca\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.553188 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.553089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-oauth-serving-cert\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" 
Apr 28 19:19:22.553188 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.553134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwvbz\" (UniqueName: \"kubernetes.io/projected/7da9f477-8a77-4f71-9500-d482754f1af1-kube-api-access-lwvbz\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.553749 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.553725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-console-config\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.553865 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.553838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-service-ca\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.553925 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.553880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-oauth-serving-cert\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.555990 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.555968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-serving-cert\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " 
pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.556177 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.556155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-oauth-config\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.570737 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.570711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwvbz\" (UniqueName: \"kubernetes.io/projected/7da9f477-8a77-4f71-9500-d482754f1af1-kube-api-access-lwvbz\") pod \"console-5488c59d85-xkp8m\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:19:22.646586 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.646538 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5488c59d85-xkp8m"
Apr 28 19:19:22.805364 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:22.805243 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5488c59d85-xkp8m"]
Apr 28 19:19:22.809820 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:19:22.809786 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da9f477_8a77_4f71_9500_d482754f1af1.slice/crio-b2ce265889d518a9c2dc15e26112329741f2415755576ba24a8447f762bcd1b6 WatchSource:0}: Error finding container b2ce265889d518a9c2dc15e26112329741f2415755576ba24a8447f762bcd1b6: Status 404 returned error can't find the container with id b2ce265889d518a9c2dc15e26112329741f2415755576ba24a8447f762bcd1b6
Apr 28 19:19:23.044166 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:23.044129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5488c59d85-xkp8m" event={"ID":"7da9f477-8a77-4f71-9500-d482754f1af1","Type":"ContainerStarted","Data":"b2ce265889d518a9c2dc15e26112329741f2415755576ba24a8447f762bcd1b6"}
Apr 28 19:19:25.053207 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:25.053173 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" event={"ID":"6f21f574-c1af-4f40-9435-416276a65b15","Type":"ContainerStarted","Data":"de68d54fac321d2d063829c01421c2b245c580bd77fba8acd3524c4f09c79d31"}
Apr 28 19:19:25.084344 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:25.084280 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-smtv9" podStartSLOduration=33.866891885 podStartE2EDuration="36.084259466s" podCreationTimestamp="2026-04-28 19:18:49 +0000 UTC" firstStartedPulling="2026-04-28 19:19:21.76352794 +0000 UTC m=+177.726428290" lastFinishedPulling="2026-04-28 19:19:23.98089551 +0000 UTC m=+179.943795871" observedRunningTime="2026-04-28 19:19:25.083803793 +0000 UTC m=+181.046704163" watchObservedRunningTime="2026-04-28 19:19:25.084259466 +0000 UTC m=+181.047159832"
Apr 28 19:19:27.062299 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:27.062251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5488c59d85-xkp8m" event={"ID":"7da9f477-8a77-4f71-9500-d482754f1af1","Type":"ContainerStarted","Data":"1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee"}
Apr 28 19:19:27.094890 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:27.094824 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5488c59d85-xkp8m" podStartSLOduration=1.8966494969999999 podStartE2EDuration="5.094802117s" podCreationTimestamp="2026-04-28 19:19:22 +0000 UTC" firstStartedPulling="2026-04-28 19:19:22.812387173 +0000 UTC m=+178.775287522" lastFinishedPulling="2026-04-28 19:19:26.010539771 +0000 UTC m=+181.973440142" observedRunningTime="2026-04-28 19:19:27.092250554 +0000 UTC m=+183.055150925" watchObservedRunningTime="2026-04-28 19:19:27.094802117 +0000 UTC m=+183.057702485"
Apr 28 19:19:27.662105 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:27.662078 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-64f9844bff-crx4m"
Apr 28 19:19:32.117398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.117358 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lpcfl"]
Apr 28 19:19:32.122081 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.122055 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.130900 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.130874 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 28 19:19:32.131214 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.131191 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 28 19:19:32.131304 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.131218 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 28 19:19:32.131564 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.131542 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4l8px\""
Apr 28 19:19:32.134720 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.134702 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 28 19:19:32.241060 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.241023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.241231 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.241075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-sys\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.241231 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.241127 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-wtmp\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.241231 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.241147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-metrics-client-ca\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.241231 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.241211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-textfile\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.241382 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.241254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-tls\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.241382 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.241273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-accelerators-collector-config\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.241382 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.241333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg9ck\" (UniqueName: \"kubernetes.io/projected/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-kube-api-access-sg9ck\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.241382 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.241369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-root\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.341941 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.341901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-accelerators-collector-config\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342117 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.341955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg9ck\" (UniqueName: \"kubernetes.io/projected/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-kube-api-access-sg9ck\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342117 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.341982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-root\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342117 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.342013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342117 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.342053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-sys\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342117 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.342110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-sys\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342359 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.342128 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-root\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342359 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.342238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-wtmp\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342359 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.342267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-metrics-client-ca\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342359 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.342299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-textfile\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342359 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.342353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-tls\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342572 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.342424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-wtmp\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.342572 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:19:32.342454 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 28 19:19:32.342572 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:19:32.342510 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-tls podName:c23a392b-1dc0-46ed-aa02-2f51cadbca4c nodeName:}" failed. No retries permitted until 2026-04-28 19:19:32.842490344 +0000 UTC m=+188.805390696 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-tls") pod "node-exporter-lpcfl" (UID: "c23a392b-1dc0-46ed-aa02-2f51cadbca4c") : secret "node-exporter-tls" not found
Apr 28 19:19:32.343031 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.343009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-textfile\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.343203 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.343185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-metrics-client-ca\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.343341 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.343315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-accelerators-collector-config\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.344700 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.344676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.354412 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.354388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg9ck\" (UniqueName: \"kubernetes.io/projected/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-kube-api-access-sg9ck\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.647215 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.647175 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5488c59d85-xkp8m"
Apr 28 19:19:32.647404 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.647231 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5488c59d85-xkp8m"
Apr 28 19:19:32.652775 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.652751 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5488c59d85-xkp8m"
Apr 28 19:19:32.847572 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.847538 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-tls\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:32.850343 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:32.850317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c23a392b-1dc0-46ed-aa02-2f51cadbca4c-node-exporter-tls\") pod \"node-exporter-lpcfl\" (UID: \"c23a392b-1dc0-46ed-aa02-2f51cadbca4c\") " pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:33.032573 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:33.032545 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lpcfl"
Apr 28 19:19:33.087093 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:33.087059 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5488c59d85-xkp8m"
Apr 28 19:19:34.633724 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:19:34.633686 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc23a392b_1dc0_46ed_aa02_2f51cadbca4c.slice/crio-2e7a55c4df8d41afc76b5ff19bc4d764e31cada9b582f34a9e4590e945c648af WatchSource:0}: Error finding container 2e7a55c4df8d41afc76b5ff19bc4d764e31cada9b582f34a9e4590e945c648af: Status 404 returned error can't find the container with id 2e7a55c4df8d41afc76b5ff19bc4d764e31cada9b582f34a9e4590e945c648af
Apr 28 19:19:35.093384 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:35.093347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-hgljc" event={"ID":"f38d2ea1-629f-4e29-8ffd-33cc8928f1b9","Type":"ContainerStarted","Data":"e1b054431898b210d707c0a2a782a2433ae92592b9059df698b6fd46744fc64b"}
Apr 28 19:19:35.094194 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:35.094113 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-hgljc"
Apr 28 19:19:35.095692 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:35.095222 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lpcfl" event={"ID":"c23a392b-1dc0-46ed-aa02-2f51cadbca4c","Type":"ContainerStarted","Data":"2e7a55c4df8d41afc76b5ff19bc4d764e31cada9b582f34a9e4590e945c648af"}
Apr 28 19:19:35.106295 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:35.106243 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-hgljc"
Apr 28 19:19:35.119327 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:35.119268 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-hgljc" podStartSLOduration=1.473844641 podStartE2EDuration="18.119248809s" podCreationTimestamp="2026-04-28 19:19:17 +0000 UTC" firstStartedPulling="2026-04-28 19:19:18.069072795 +0000 UTC m=+174.031973140" lastFinishedPulling="2026-04-28 19:19:34.71447696 +0000 UTC m=+190.677377308" observedRunningTime="2026-04-28 19:19:35.117258613 +0000 UTC m=+191.080158980" watchObservedRunningTime="2026-04-28 19:19:35.119248809 +0000 UTC m=+191.082149175"
Apr 28 19:19:36.100804 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:36.100762 2576 generic.go:358] "Generic (PLEG): container finished" podID="c23a392b-1dc0-46ed-aa02-2f51cadbca4c" containerID="19983e21b9095b4d66b85fe727ae51aa5ee163c7342f07d481d36f0b290f363e" exitCode=0
Apr 28 19:19:36.101282 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:36.100854 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lpcfl" event={"ID":"c23a392b-1dc0-46ed-aa02-2f51cadbca4c","Type":"ContainerDied","Data":"19983e21b9095b4d66b85fe727ae51aa5ee163c7342f07d481d36f0b290f363e"}
Apr 28 19:19:37.106553 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:37.106519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lpcfl" event={"ID":"c23a392b-1dc0-46ed-aa02-2f51cadbca4c","Type":"ContainerStarted","Data":"87024644d99b9f0992ec786f6d98783789acdb25295d41bcb3aa20fa5b31f031"}
Apr 28 19:19:37.106985 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:37.106561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lpcfl" event={"ID":"c23a392b-1dc0-46ed-aa02-2f51cadbca4c","Type":"ContainerStarted","Data":"912a601ec08b14e9216fec2d77e497f3387c7c137ce3742cb090952df372b5d5"}
Apr 28 19:19:40.031515 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:40.031483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b4b455f45-wj89m"
Apr 28 19:19:40.068435 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:40.068379 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lpcfl" podStartSLOduration=7.305184935 podStartE2EDuration="8.068360747s" podCreationTimestamp="2026-04-28 19:19:32 +0000 UTC" firstStartedPulling="2026-04-28 19:19:34.635519062 +0000 UTC m=+190.598419405" lastFinishedPulling="2026-04-28 19:19:35.398694853 +0000 UTC m=+191.361595217" observedRunningTime="2026-04-28 19:19:37.146847911 +0000 UTC m=+193.109748288" watchObservedRunningTime="2026-04-28 19:19:40.068360747 +0000 UTC m=+196.031261113"
Apr 28 19:19:42.675084 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:42.675037 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-64f9844bff-crx4m" podUID="59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" containerName="registry" containerID="cri-o://71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e" gracePeriod=30
Apr 28 19:19:42.962262 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:42.962238 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64f9844bff-crx4m"
Apr 28 19:19:43.043860 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.043818 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-ca-trust-extracted\") pod \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") "
Apr 28 19:19:43.044051 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.043879 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-installation-pull-secrets\") pod \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") "
Apr 28 19:19:43.044051 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.043918 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trwnv\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-kube-api-access-trwnv\") pod \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") "
Apr 28 19:19:43.044051 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.043936 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-bound-sa-token\") pod \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") "
Apr 28 19:19:43.044051 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.043951 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls\") pod \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") "
Apr 28 19:19:43.044051 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.043978 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-trusted-ca\") pod \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") "
Apr 28 19:19:43.044051 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.044003 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-image-registry-private-configuration\") pod \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") "
Apr 28 19:19:43.044051 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.044037 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-certificates\") pod \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\" (UID: \"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe\") "
Apr 28 19:19:43.044499 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.044414 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:19:43.044928 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.044695 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:19:43.046631 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.046544 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-kube-api-access-trwnv" (OuterVolumeSpecName: "kube-api-access-trwnv") pod "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe"). InnerVolumeSpecName "kube-api-access-trwnv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:19:43.046735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.046678 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:19:43.046857 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.046830 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:19:43.046947 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.046918 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:19:43.047182 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.047159 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:19:43.055892 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.055864 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" (UID: "59cffdfd-1e99-4ed5-96e7-7b40825f9cbe"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:19:43.126702 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.126663 2576 generic.go:358] "Generic (PLEG): container finished" podID="59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" containerID="71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e" exitCode=0
Apr 28 19:19:43.126886 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.126709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64f9844bff-crx4m" event={"ID":"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe","Type":"ContainerDied","Data":"71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e"}
Apr 28 19:19:43.126886 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.126731 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64f9844bff-crx4m"
Apr 28 19:19:43.126886 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.126764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64f9844bff-crx4m" event={"ID":"59cffdfd-1e99-4ed5-96e7-7b40825f9cbe","Type":"ContainerDied","Data":"50bbdf1eca16be5f65e476bf15651f3b7eb4cec87745d9626fa129dd0bc504f6"}
Apr 28 19:19:43.126886 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.126785 2576 scope.go:117] "RemoveContainer" containerID="71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e"
Apr 28 19:19:43.136196 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.136176 2576 scope.go:117] "RemoveContainer" containerID="71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e"
Apr 28 19:19:43.136563 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:19:43.136541 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e\": container with ID starting with 71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e not found: ID does not exist" containerID="71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e"
Apr 28 19:19:43.136671 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.136571 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e"} err="failed to get container status \"71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e\": rpc error: code = NotFound desc = could not find container \"71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e\": container with ID starting with 71410e8647cb3549f8dc6fa40fca7a28ae1bf4928b689621460c5925e1383b3e not found: ID does not exist"
Apr 28 19:19:43.145578 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.145548 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trwnv\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-kube-api-access-trwnv\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:19:43.145578 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.145574 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-bound-sa-token\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:19:43.145578 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.145584 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:19:43.145776 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.145593 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-trusted-ca\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:19:43.145776 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.145633 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-image-registry-private-configuration\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:19:43.145776 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.145649 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-registry-certificates\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:19:43.145776 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.145661 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-ca-trust-extracted\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:19:43.145776 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.145673 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe-installation-pull-secrets\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:19:43.150510 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.150486 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64f9844bff-crx4m"]
Apr 28 19:19:43.153506 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:43.153483 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-64f9844bff-crx4m"]
Apr 28 19:19:44.514021 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:44.513984 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" path="/var/lib/kubelet/pods/59cffdfd-1e99-4ed5-96e7-7b40825f9cbe/volumes"
Apr 28 19:19:48.832208 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:19:48.832179 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5488c59d85-xkp8m"]
Apr 28 19:20:09.198084 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:09.198050 2576 generic.go:358] "Generic (PLEG): container finished" podID="65cc2665-0cf0-4e9a-9316-292edb21e2bc" containerID="8c830bcdf1cb4bd2a3e6438c9f9e83698d2171ab02e460ebf4e086340c496547" exitCode=0
Apr 28 19:20:09.198489 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:09.198116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xq7sp" event={"ID":"65cc2665-0cf0-4e9a-9316-292edb21e2bc","Type":"ContainerDied","Data":"8c830bcdf1cb4bd2a3e6438c9f9e83698d2171ab02e460ebf4e086340c496547"}
Apr 28 19:20:09.198489 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:09.198454 2576 scope.go:117] "RemoveContainer" containerID="8c830bcdf1cb4bd2a3e6438c9f9e83698d2171ab02e460ebf4e086340c496547"
Apr 28 19:20:09.944441 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:09.944411 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qjvwn_018d803a-f231-469e-8539-32dcc07e43f8/dns/0.log"
Apr 28 19:20:09.952475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:09.952454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qjvwn_018d803a-f231-469e-8539-32dcc07e43f8/kube-rbac-proxy/0.log"
Apr 28 19:20:10.202310 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:10.202229 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xq7sp" event={"ID":"65cc2665-0cf0-4e9a-9316-292edb21e2bc","Type":"ContainerStarted","Data":"f1cf2ab20b381831776734c5b9f73bd931a60ccf97c3ac9153c9e19bc8d885c6"}
Apr 28 19:20:10.411931 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:10.411896 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rb8nk_edd42d80-2884-4124-a4b2-2aea5543b72b/dns-node-resolver/0.log"
Apr 28 19:20:13.851563 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:13.851509 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5488c59d85-xkp8m" podUID="7da9f477-8a77-4f71-9500-d482754f1af1" containerName="console" containerID="cri-o://1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee" gracePeriod=15
Apr 28 19:20:14.123485 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.123460 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5488c59d85-xkp8m_7da9f477-8a77-4f71-9500-d482754f1af1/console/0.log"
Apr 28 19:20:14.123633 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.123541 2576 util.go:48] "No ready sandbox for pod can be
found. Need to start a new one" pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:20:14.213971 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.213943 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5488c59d85-xkp8m_7da9f477-8a77-4f71-9500-d482754f1af1/console/0.log" Apr 28 19:20:14.214122 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.213984 2576 generic.go:358] "Generic (PLEG): container finished" podID="7da9f477-8a77-4f71-9500-d482754f1af1" containerID="1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee" exitCode=2 Apr 28 19:20:14.214122 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.214046 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5488c59d85-xkp8m" Apr 28 19:20:14.214122 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.214060 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5488c59d85-xkp8m" event={"ID":"7da9f477-8a77-4f71-9500-d482754f1af1","Type":"ContainerDied","Data":"1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee"} Apr 28 19:20:14.214122 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.214085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5488c59d85-xkp8m" event={"ID":"7da9f477-8a77-4f71-9500-d482754f1af1","Type":"ContainerDied","Data":"b2ce265889d518a9c2dc15e26112329741f2415755576ba24a8447f762bcd1b6"} Apr 28 19:20:14.214122 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.214102 2576 scope.go:117] "RemoveContainer" containerID="1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee" Apr 28 19:20:14.221440 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.221415 2576 scope.go:117] "RemoveContainer" containerID="1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee" Apr 28 19:20:14.221695 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:20:14.221676 2576 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee\": container with ID starting with 1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee not found: ID does not exist" containerID="1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee" Apr 28 19:20:14.221752 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.221705 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee"} err="failed to get container status \"1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee\": rpc error: code = NotFound desc = could not find container \"1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee\": container with ID starting with 1f4cfe4f9ef4b3cd7e6c47d12ad95eb95bd4a6d5750a8f4f5ea1339b9e0fcfee not found: ID does not exist" Apr 28 19:20:14.308213 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.308190 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-oauth-serving-cert\") pod \"7da9f477-8a77-4f71-9500-d482754f1af1\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " Apr 28 19:20:14.308340 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.308224 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-oauth-config\") pod \"7da9f477-8a77-4f71-9500-d482754f1af1\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " Apr 28 19:20:14.308340 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.308241 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-serving-cert\") pod \"7da9f477-8a77-4f71-9500-d482754f1af1\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " Apr 28 19:20:14.308340 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.308275 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-service-ca\") pod \"7da9f477-8a77-4f71-9500-d482754f1af1\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " Apr 28 19:20:14.308340 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.308308 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-console-config\") pod \"7da9f477-8a77-4f71-9500-d482754f1af1\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " Apr 28 19:20:14.308542 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.308359 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwvbz\" (UniqueName: \"kubernetes.io/projected/7da9f477-8a77-4f71-9500-d482754f1af1-kube-api-access-lwvbz\") pod \"7da9f477-8a77-4f71-9500-d482754f1af1\" (UID: \"7da9f477-8a77-4f71-9500-d482754f1af1\") " Apr 28 19:20:14.308678 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.308650 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7da9f477-8a77-4f71-9500-d482754f1af1" (UID: "7da9f477-8a77-4f71-9500-d482754f1af1"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:14.308791 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.308721 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-service-ca" (OuterVolumeSpecName: "service-ca") pod "7da9f477-8a77-4f71-9500-d482754f1af1" (UID: "7da9f477-8a77-4f71-9500-d482754f1af1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:14.308791 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.308722 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-console-config" (OuterVolumeSpecName: "console-config") pod "7da9f477-8a77-4f71-9500-d482754f1af1" (UID: "7da9f477-8a77-4f71-9500-d482754f1af1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:20:14.308791 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.308739 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-oauth-serving-cert\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.310774 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.310750 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da9f477-8a77-4f71-9500-d482754f1af1-kube-api-access-lwvbz" (OuterVolumeSpecName: "kube-api-access-lwvbz") pod "7da9f477-8a77-4f71-9500-d482754f1af1" (UID: "7da9f477-8a77-4f71-9500-d482754f1af1"). InnerVolumeSpecName "kube-api-access-lwvbz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:20:14.311002 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.310973 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7da9f477-8a77-4f71-9500-d482754f1af1" (UID: "7da9f477-8a77-4f71-9500-d482754f1af1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:14.311083 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.310996 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7da9f477-8a77-4f71-9500-d482754f1af1" (UID: "7da9f477-8a77-4f71-9500-d482754f1af1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:14.409589 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.409496 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-console-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.409589 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.409530 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lwvbz\" (UniqueName: \"kubernetes.io/projected/7da9f477-8a77-4f71-9500-d482754f1af1-kube-api-access-lwvbz\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.409589 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.409540 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-oauth-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.409589 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:20:14.409548 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7da9f477-8a77-4f71-9500-d482754f1af1-console-serving-cert\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.409589 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.409557 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7da9f477-8a77-4f71-9500-d482754f1af1-service-ca\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:20:14.540704 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.540672 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5488c59d85-xkp8m"] Apr 28 19:20:14.543319 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:14.543291 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5488c59d85-xkp8m"] Apr 28 19:20:16.513205 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:16.513169 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7da9f477-8a77-4f71-9500-d482754f1af1" path="/var/lib/kubelet/pods/7da9f477-8a77-4f71-9500-d482754f1af1/volumes" Apr 28 19:20:36.277715 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:36.277666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:20:36.280100 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:36.280078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4236a3f6-5c96-4e29-bb77-8dafe3cd242d-metrics-certs\") pod \"network-metrics-daemon-8j8w9\" (UID: \"4236a3f6-5c96-4e29-bb77-8dafe3cd242d\") " 
pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:20:36.414308 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:36.414280 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wz9tz\"" Apr 28 19:20:36.421744 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:36.421726 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8j8w9" Apr 28 19:20:36.559521 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:36.559447 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8j8w9"] Apr 28 19:20:36.562494 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:20:36.562468 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4236a3f6_5c96_4e29_bb77_8dafe3cd242d.slice/crio-6b11ca5f00ab419c834d0f433b2d41c25fdc713f79553b570b36f8f622015dfb WatchSource:0}: Error finding container 6b11ca5f00ab419c834d0f433b2d41c25fdc713f79553b570b36f8f622015dfb: Status 404 returned error can't find the container with id 6b11ca5f00ab419c834d0f433b2d41c25fdc713f79553b570b36f8f622015dfb Apr 28 19:20:37.282269 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:37.282231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8j8w9" event={"ID":"4236a3f6-5c96-4e29-bb77-8dafe3cd242d","Type":"ContainerStarted","Data":"6b11ca5f00ab419c834d0f433b2d41c25fdc713f79553b570b36f8f622015dfb"} Apr 28 19:20:38.286811 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:38.286765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8j8w9" event={"ID":"4236a3f6-5c96-4e29-bb77-8dafe3cd242d","Type":"ContainerStarted","Data":"0a4e39b1f8d6713b55bcfd579b563182f6df682c034266d32a577cfa7d4fe581"} Apr 28 19:20:38.286811 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:38.286811 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8j8w9" event={"ID":"4236a3f6-5c96-4e29-bb77-8dafe3cd242d","Type":"ContainerStarted","Data":"71fee31bfa5e4fce75977bf79fe7077b60b645a65877917797418ed8a11b79ca"} Apr 28 19:20:38.315817 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:20:38.315765 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8j8w9" podStartSLOduration=253.410217817 podStartE2EDuration="4m14.315745742s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:20:36.564774422 +0000 UTC m=+252.527674766" lastFinishedPulling="2026-04-28 19:20:37.470302347 +0000 UTC m=+253.433202691" observedRunningTime="2026-04-28 19:20:38.315670724 +0000 UTC m=+254.278571090" watchObservedRunningTime="2026-04-28 19:20:38.315745742 +0000 UTC m=+254.278646109" Apr 28 19:21:01.277169 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.277137 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b5cc65b89-qd9zn"] Apr 28 19:21:01.277503 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.277385 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" containerName="registry" Apr 28 19:21:01.277503 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.277398 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" containerName="registry" Apr 28 19:21:01.277503 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.277452 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7da9f477-8a77-4f71-9500-d482754f1af1" containerName="console" Apr 28 19:21:01.277503 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.277458 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da9f477-8a77-4f71-9500-d482754f1af1" containerName="console" Apr 28 19:21:01.277711 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:21:01.277515 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7da9f477-8a77-4f71-9500-d482754f1af1" containerName="console" Apr 28 19:21:01.277711 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.277528 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="59cffdfd-1e99-4ed5-96e7-7b40825f9cbe" containerName="registry" Apr 28 19:21:01.280948 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.280926 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.284373 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.284350 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 28 19:21:01.285710 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.285666 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 28 19:21:01.285808 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.285713 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 28 19:21:01.285808 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.285735 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 28 19:21:01.285888 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.285680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 28 19:21:01.289451 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.289392 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-c9kx6\"" Apr 28 19:21:01.295312 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.295289 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 28 19:21:01.295435 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.295352 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b5cc65b89-qd9zn"] Apr 28 19:21:01.355304 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.355268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-config\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.355476 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.355310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcfgz\" (UniqueName: \"kubernetes.io/projected/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-kube-api-access-pcfgz\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.355476 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.355343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-oauth-serving-cert\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.355476 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.355359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-service-ca\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.355476 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.355384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-oauth-config\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.355476 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.355413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-serving-cert\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.355476 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.355428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-trusted-ca-bundle\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.455854 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.455817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-oauth-serving-cert\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.455854 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.455853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-service-ca\") pod \"console-6b5cc65b89-qd9zn\" 
(UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.456065 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.455878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-oauth-config\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.456065 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.455905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-serving-cert\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.456065 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.455920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-trusted-ca-bundle\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.456065 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.455940 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-config\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.456065 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.455958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcfgz\" (UniqueName: 
\"kubernetes.io/projected/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-kube-api-access-pcfgz\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.456578 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.456549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-oauth-serving-cert\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.456815 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.456796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-trusted-ca-bundle\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.456859 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.456804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-config\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.457261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.457238 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-service-ca\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.458470 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.458453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-oauth-config\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.458523 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.458466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-serving-cert\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.465119 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.465092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcfgz\" (UniqueName: \"kubernetes.io/projected/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-kube-api-access-pcfgz\") pod \"console-6b5cc65b89-qd9zn\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.595040 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.594952 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:01.714042 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:01.714016 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b5cc65b89-qd9zn"] Apr 28 19:21:01.716495 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:21:01.716466 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48761ea5_8ce6_45cb_8a6a_5c69554bafc5.slice/crio-4ee5651ce65a5b182cf2020c6f562f33789d6587ba08ce335834095f0d0d00fb WatchSource:0}: Error finding container 4ee5651ce65a5b182cf2020c6f562f33789d6587ba08ce335834095f0d0d00fb: Status 404 returned error can't find the container with id 4ee5651ce65a5b182cf2020c6f562f33789d6587ba08ce335834095f0d0d00fb Apr 28 19:21:02.350324 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:02.350286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5cc65b89-qd9zn" event={"ID":"48761ea5-8ce6-45cb-8a6a-5c69554bafc5","Type":"ContainerStarted","Data":"b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2"} Apr 28 19:21:02.350324 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:02.350322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5cc65b89-qd9zn" event={"ID":"48761ea5-8ce6-45cb-8a6a-5c69554bafc5","Type":"ContainerStarted","Data":"4ee5651ce65a5b182cf2020c6f562f33789d6587ba08ce335834095f0d0d00fb"} Apr 28 19:21:02.373636 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:02.373577 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b5cc65b89-qd9zn" podStartSLOduration=1.37356343 podStartE2EDuration="1.37356343s" podCreationTimestamp="2026-04-28 19:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:21:02.372474415 +0000 UTC m=+278.335374792" 
watchObservedRunningTime="2026-04-28 19:21:02.37356343 +0000 UTC m=+278.336463795" Apr 28 19:21:10.645561 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.645519 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b5cc65b89-qd9zn"] Apr 28 19:21:10.690479 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.690448 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b4bf5bbd9-w6s82"] Apr 28 19:21:10.693666 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.693648 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.711484 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.711457 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b4bf5bbd9-w6s82"] Apr 28 19:21:10.824431 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.824394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj87z\" (UniqueName: \"kubernetes.io/projected/9f7863a5-951d-4d3a-b503-a031d7ccb87f-kube-api-access-bj87z\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.824431 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.824435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-oauth-serving-cert\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.824638 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.824506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-service-ca\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.824638 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.824568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-serving-cert\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.824638 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.824632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-oauth-config\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.824737 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.824650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-config\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.824737 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.824671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-trusted-ca-bundle\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.925735 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:21:10.925638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-trusted-ca-bundle\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.925735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.925694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj87z\" (UniqueName: \"kubernetes.io/projected/9f7863a5-951d-4d3a-b503-a031d7ccb87f-kube-api-access-bj87z\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.925735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.925715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-oauth-serving-cert\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.926003 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.925747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-service-ca\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.926003 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.925778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-serving-cert\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " 
pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.926003 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.925807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-oauth-config\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.926003 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.925830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-config\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.926442 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.926411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-service-ca\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.926546 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.926527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-oauth-serving-cert\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.926593 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.926558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-trusted-ca-bundle\") pod \"console-7b4bf5bbd9-w6s82\" (UID: 
\"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.926721 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.926576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-config\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.928377 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.928359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-serving-cert\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.928463 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.928392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-oauth-config\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:10.936915 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:10.936893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj87z\" (UniqueName: \"kubernetes.io/projected/9f7863a5-951d-4d3a-b503-a031d7ccb87f-kube-api-access-bj87z\") pod \"console-7b4bf5bbd9-w6s82\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:11.002959 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:11.002921 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:11.156030 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:11.155997 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b4bf5bbd9-w6s82"] Apr 28 19:21:11.157450 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:21:11.157424 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f7863a5_951d_4d3a_b503_a031d7ccb87f.slice/crio-47c64f69664b410ab70dde04f7cfc5eeb182ffe8ab7dc13e32188e81f6d1bf58 WatchSource:0}: Error finding container 47c64f69664b410ab70dde04f7cfc5eeb182ffe8ab7dc13e32188e81f6d1bf58: Status 404 returned error can't find the container with id 47c64f69664b410ab70dde04f7cfc5eeb182ffe8ab7dc13e32188e81f6d1bf58 Apr 28 19:21:11.374958 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:11.374920 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4bf5bbd9-w6s82" event={"ID":"9f7863a5-951d-4d3a-b503-a031d7ccb87f","Type":"ContainerStarted","Data":"e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af"} Apr 28 19:21:11.374958 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:11.374963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4bf5bbd9-w6s82" event={"ID":"9f7863a5-951d-4d3a-b503-a031d7ccb87f","Type":"ContainerStarted","Data":"47c64f69664b410ab70dde04f7cfc5eeb182ffe8ab7dc13e32188e81f6d1bf58"} Apr 28 19:21:11.393581 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:11.393509 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b4bf5bbd9-w6s82" podStartSLOduration=1.393493255 podStartE2EDuration="1.393493255s" podCreationTimestamp="2026-04-28 19:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:21:11.39276771 +0000 UTC 
m=+287.355668077" watchObservedRunningTime="2026-04-28 19:21:11.393493255 +0000 UTC m=+287.356393617" Apr 28 19:21:11.596024 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:11.595973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:21.003559 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:21.003507 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:21.003559 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:21.003554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:21.008237 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:21.008215 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:21.404397 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:21.404314 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:21:24.430773 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:24.430739 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:21:24.431239 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:24.431192 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:21:24.438172 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:24.438148 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:21:24.438172 ip-10-0-139-184 kubenswrapper[2576]: I0428 
19:21:24.438173 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:21:24.441717 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:24.441697 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 28 19:21:35.665598 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:35.665530 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b5cc65b89-qd9zn" podUID="48761ea5-8ce6-45cb-8a6a-5c69554bafc5" containerName="console" containerID="cri-o://b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2" gracePeriod=15 Apr 28 19:21:35.914061 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:35.914035 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b5cc65b89-qd9zn_48761ea5-8ce6-45cb-8a6a-5c69554bafc5/console/0.log" Apr 28 19:21:35.914180 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:35.914096 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:36.019898 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.019873 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-config\") pod \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " Apr 28 19:21:36.020060 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.019910 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-trusted-ca-bundle\") pod \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " Apr 28 19:21:36.020060 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.019963 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcfgz\" (UniqueName: \"kubernetes.io/projected/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-kube-api-access-pcfgz\") pod \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " Apr 28 19:21:36.020060 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.019991 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-oauth-serving-cert\") pod \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " Apr 28 19:21:36.020192 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.020137 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-service-ca\") pod \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " Apr 28 19:21:36.020245 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:21:36.020204 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-oauth-config\") pod \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " Apr 28 19:21:36.020245 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.020232 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-serving-cert\") pod \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\" (UID: \"48761ea5-8ce6-45cb-8a6a-5c69554bafc5\") " Apr 28 19:21:36.020433 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.020408 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-config" (OuterVolumeSpecName: "console-config") pod "48761ea5-8ce6-45cb-8a6a-5c69554bafc5" (UID: "48761ea5-8ce6-45cb-8a6a-5c69554bafc5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:21:36.020555 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.020437 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "48761ea5-8ce6-45cb-8a6a-5c69554bafc5" (UID: "48761ea5-8ce6-45cb-8a6a-5c69554bafc5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:21:36.020555 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.020478 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "48761ea5-8ce6-45cb-8a6a-5c69554bafc5" (UID: "48761ea5-8ce6-45cb-8a6a-5c69554bafc5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:21:36.020555 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.020534 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-service-ca" (OuterVolumeSpecName: "service-ca") pod "48761ea5-8ce6-45cb-8a6a-5c69554bafc5" (UID: "48761ea5-8ce6-45cb-8a6a-5c69554bafc5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:21:36.022324 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.022297 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "48761ea5-8ce6-45cb-8a6a-5c69554bafc5" (UID: "48761ea5-8ce6-45cb-8a6a-5c69554bafc5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:36.022411 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.022391 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-kube-api-access-pcfgz" (OuterVolumeSpecName: "kube-api-access-pcfgz") pod "48761ea5-8ce6-45cb-8a6a-5c69554bafc5" (UID: "48761ea5-8ce6-45cb-8a6a-5c69554bafc5"). InnerVolumeSpecName "kube-api-access-pcfgz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:21:36.022480 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.022456 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "48761ea5-8ce6-45cb-8a6a-5c69554bafc5" (UID: "48761ea5-8ce6-45cb-8a6a-5c69554bafc5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:21:36.121010 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.120959 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-oauth-serving-cert\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:21:36.121010 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.121005 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-service-ca\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:21:36.121010 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.121016 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-oauth-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:21:36.121010 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.121025 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-serving-cert\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:21:36.121271 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.121035 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-console-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:21:36.121271 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.121043 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-trusted-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:21:36.121271 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.121052 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pcfgz\" (UniqueName: \"kubernetes.io/projected/48761ea5-8ce6-45cb-8a6a-5c69554bafc5-kube-api-access-pcfgz\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:21:36.448642 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.448561 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b5cc65b89-qd9zn_48761ea5-8ce6-45cb-8a6a-5c69554bafc5/console/0.log" Apr 28 19:21:36.448642 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.448614 2576 generic.go:358] "Generic (PLEG): container finished" podID="48761ea5-8ce6-45cb-8a6a-5c69554bafc5" containerID="b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2" exitCode=2 Apr 28 19:21:36.448893 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.448673 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5cc65b89-qd9zn" event={"ID":"48761ea5-8ce6-45cb-8a6a-5c69554bafc5","Type":"ContainerDied","Data":"b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2"} Apr 28 19:21:36.448893 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.448685 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b5cc65b89-qd9zn" Apr 28 19:21:36.448893 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.448700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5cc65b89-qd9zn" event={"ID":"48761ea5-8ce6-45cb-8a6a-5c69554bafc5","Type":"ContainerDied","Data":"4ee5651ce65a5b182cf2020c6f562f33789d6587ba08ce335834095f0d0d00fb"} Apr 28 19:21:36.448893 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.448715 2576 scope.go:117] "RemoveContainer" containerID="b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2" Apr 28 19:21:36.456843 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.456826 2576 scope.go:117] "RemoveContainer" containerID="b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2" Apr 28 19:21:36.457095 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:21:36.457077 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2\": container with ID starting with b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2 not found: ID does not exist" containerID="b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2" Apr 28 19:21:36.457159 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.457103 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2"} err="failed to get container status \"b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2\": rpc error: code = NotFound desc = could not find container \"b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2\": container with ID starting with b3e20f03bc7607f327089cd980ab22aaa35117200d67d44f42144e0e6c0e29a2 not found: ID does not exist" Apr 28 19:21:36.471394 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.471368 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b5cc65b89-qd9zn"] Apr 28 19:21:36.475946 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.475926 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b5cc65b89-qd9zn"] Apr 28 19:21:36.513561 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:21:36.513531 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48761ea5-8ce6-45cb-8a6a-5c69554bafc5" path="/var/lib/kubelet/pods/48761ea5-8ce6-45cb-8a6a-5c69554bafc5/volumes" Apr 28 19:25:21.343033 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.342950 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-d9gwv"] Apr 28 19:25:21.343495 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.343287 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48761ea5-8ce6-45cb-8a6a-5c69554bafc5" containerName="console" Apr 28 19:25:21.343495 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.343306 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="48761ea5-8ce6-45cb-8a6a-5c69554bafc5" containerName="console" Apr 28 19:25:21.343495 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.343388 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="48761ea5-8ce6-45cb-8a6a-5c69554bafc5" containerName="console" Apr 28 19:25:21.346253 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.346237 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.349334 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.349311 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 28 19:25:21.364653 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.364630 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d9gwv"] Apr 28 19:25:21.491403 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.491363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71247a4c-9959-44f9-acd7-c5243ed29332-original-pull-secret\") pod \"global-pull-secret-syncer-d9gwv\" (UID: \"71247a4c-9959-44f9-acd7-c5243ed29332\") " pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.491572 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.491442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/71247a4c-9959-44f9-acd7-c5243ed29332-kubelet-config\") pod \"global-pull-secret-syncer-d9gwv\" (UID: \"71247a4c-9959-44f9-acd7-c5243ed29332\") " pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.491572 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.491482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/71247a4c-9959-44f9-acd7-c5243ed29332-dbus\") pod \"global-pull-secret-syncer-d9gwv\" (UID: \"71247a4c-9959-44f9-acd7-c5243ed29332\") " pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.592544 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.592504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/71247a4c-9959-44f9-acd7-c5243ed29332-kubelet-config\") pod \"global-pull-secret-syncer-d9gwv\" (UID: \"71247a4c-9959-44f9-acd7-c5243ed29332\") " pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.592743 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.592565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/71247a4c-9959-44f9-acd7-c5243ed29332-dbus\") pod \"global-pull-secret-syncer-d9gwv\" (UID: \"71247a4c-9959-44f9-acd7-c5243ed29332\") " pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.592743 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.592632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71247a4c-9959-44f9-acd7-c5243ed29332-original-pull-secret\") pod \"global-pull-secret-syncer-d9gwv\" (UID: \"71247a4c-9959-44f9-acd7-c5243ed29332\") " pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.592743 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.592649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/71247a4c-9959-44f9-acd7-c5243ed29332-kubelet-config\") pod \"global-pull-secret-syncer-d9gwv\" (UID: \"71247a4c-9959-44f9-acd7-c5243ed29332\") " pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.592916 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.592758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/71247a4c-9959-44f9-acd7-c5243ed29332-dbus\") pod \"global-pull-secret-syncer-d9gwv\" (UID: \"71247a4c-9959-44f9-acd7-c5243ed29332\") " pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.595086 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.595031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71247a4c-9959-44f9-acd7-c5243ed29332-original-pull-secret\") pod \"global-pull-secret-syncer-d9gwv\" (UID: \"71247a4c-9959-44f9-acd7-c5243ed29332\") " pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.655593 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.655566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d9gwv" Apr 28 19:25:21.779753 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.779537 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d9gwv"] Apr 28 19:25:21.782392 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:25:21.782359 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71247a4c_9959_44f9_acd7_c5243ed29332.slice/crio-778f7c3b22b4173fdf54aaaec25b2b3fe7c1b4e273c3dfceb60e0ce94ec9c2ad WatchSource:0}: Error finding container 778f7c3b22b4173fdf54aaaec25b2b3fe7c1b4e273c3dfceb60e0ce94ec9c2ad: Status 404 returned error can't find the container with id 778f7c3b22b4173fdf54aaaec25b2b3fe7c1b4e273c3dfceb60e0ce94ec9c2ad Apr 28 19:25:21.783978 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:21.783957 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:25:22.039465 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:22.039424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d9gwv" event={"ID":"71247a4c-9959-44f9-acd7-c5243ed29332","Type":"ContainerStarted","Data":"778f7c3b22b4173fdf54aaaec25b2b3fe7c1b4e273c3dfceb60e0ce94ec9c2ad"} Apr 28 19:25:26.054213 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:26.054119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d9gwv" 
event={"ID":"71247a4c-9959-44f9-acd7-c5243ed29332","Type":"ContainerStarted","Data":"2354f739cd25a5188b3c133fd4d7741fe54fafbe7600c2bd8579dd5b1c2ec4ca"} Apr 28 19:25:26.074951 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:26.074883 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-d9gwv" podStartSLOduration=1.078920913 podStartE2EDuration="5.074868303s" podCreationTimestamp="2026-04-28 19:25:21 +0000 UTC" firstStartedPulling="2026-04-28 19:25:21.784114932 +0000 UTC m=+537.747015276" lastFinishedPulling="2026-04-28 19:25:25.780062318 +0000 UTC m=+541.742962666" observedRunningTime="2026-04-28 19:25:26.074161268 +0000 UTC m=+542.037061633" watchObservedRunningTime="2026-04-28 19:25:26.074868303 +0000 UTC m=+542.037768668" Apr 28 19:25:40.316142 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.316108 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv"] Apr 28 19:25:40.319404 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.319388 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.322338 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.322310 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 28 19:25:40.322435 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.322373 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 28 19:25:40.323532 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.323515 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vcdcq\"" Apr 28 19:25:40.327760 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.327740 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv"] Apr 28 19:25:40.439932 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.439890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.439932 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.439929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9k4v\" (UniqueName: \"kubernetes.io/projected/8cfb3cc8-2636-4c2d-9837-8824cb902944-kube-api-access-s9k4v\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.440160 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.439971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.540992 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.540959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.541146 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.540998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9k4v\" (UniqueName: \"kubernetes.io/projected/8cfb3cc8-2636-4c2d-9837-8824cb902944-kube-api-access-s9k4v\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.541146 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.541028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.541373 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.541358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.541429 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.541356 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.550351 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.550314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9k4v\" (UniqueName: \"kubernetes.io/projected/8cfb3cc8-2636-4c2d-9837-8824cb902944-kube-api-access-s9k4v\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.629119 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.629022 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:40.751287 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:40.751245 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv"] Apr 28 19:25:40.754186 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:25:40.754152 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cfb3cc8_2636_4c2d_9837_8824cb902944.slice/crio-272eaddf48f057877a2d0916f60ebf4b96fae090a18c48b15683d6ed8f8caa7f WatchSource:0}: Error finding container 272eaddf48f057877a2d0916f60ebf4b96fae090a18c48b15683d6ed8f8caa7f: Status 404 returned error can't find the container with id 272eaddf48f057877a2d0916f60ebf4b96fae090a18c48b15683d6ed8f8caa7f Apr 28 19:25:41.098549 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:41.098507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" event={"ID":"8cfb3cc8-2636-4c2d-9837-8824cb902944","Type":"ContainerStarted","Data":"272eaddf48f057877a2d0916f60ebf4b96fae090a18c48b15683d6ed8f8caa7f"} Apr 28 19:25:47.117790 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:47.117699 2576 generic.go:358] "Generic (PLEG): container finished" podID="8cfb3cc8-2636-4c2d-9837-8824cb902944" containerID="74bf6fe1d870e47f2743de904a246d46bae37edcf7e778eff36a364f72ed7ad5" exitCode=0 Apr 28 19:25:47.117790 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:47.117743 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" event={"ID":"8cfb3cc8-2636-4c2d-9837-8824cb902944","Type":"ContainerDied","Data":"74bf6fe1d870e47f2743de904a246d46bae37edcf7e778eff36a364f72ed7ad5"} Apr 28 19:25:50.127348 ip-10-0-139-184 kubenswrapper[2576]: 
I0428 19:25:50.127311 2576 generic.go:358] "Generic (PLEG): container finished" podID="8cfb3cc8-2636-4c2d-9837-8824cb902944" containerID="c245cf323c8a3a38e295525f9a843aecd7509770dee5183e2cd67de3602f77af" exitCode=0 Apr 28 19:25:50.127788 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:50.127400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" event={"ID":"8cfb3cc8-2636-4c2d-9837-8824cb902944","Type":"ContainerDied","Data":"c245cf323c8a3a38e295525f9a843aecd7509770dee5183e2cd67de3602f77af"} Apr 28 19:25:57.152303 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:57.152267 2576 generic.go:358] "Generic (PLEG): container finished" podID="8cfb3cc8-2636-4c2d-9837-8824cb902944" containerID="11dec56b94bc9129c8a3f7eef6b7cbbd391c8b9fbdda22381c89f61ea13977ca" exitCode=0 Apr 28 19:25:57.152717 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:57.152361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" event={"ID":"8cfb3cc8-2636-4c2d-9837-8824cb902944","Type":"ContainerDied","Data":"11dec56b94bc9129c8a3f7eef6b7cbbd391c8b9fbdda22381c89f61ea13977ca"} Apr 28 19:25:58.283043 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:58.283013 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:25:58.376639 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:58.376587 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-bundle\") pod \"8cfb3cc8-2636-4c2d-9837-8824cb902944\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " Apr 28 19:25:58.376639 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:58.376645 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9k4v\" (UniqueName: \"kubernetes.io/projected/8cfb3cc8-2636-4c2d-9837-8824cb902944-kube-api-access-s9k4v\") pod \"8cfb3cc8-2636-4c2d-9837-8824cb902944\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " Apr 28 19:25:58.376863 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:58.376691 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-util\") pod \"8cfb3cc8-2636-4c2d-9837-8824cb902944\" (UID: \"8cfb3cc8-2636-4c2d-9837-8824cb902944\") " Apr 28 19:25:58.377250 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:58.377217 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-bundle" (OuterVolumeSpecName: "bundle") pod "8cfb3cc8-2636-4c2d-9837-8824cb902944" (UID: "8cfb3cc8-2636-4c2d-9837-8824cb902944"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:25:58.379019 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:58.378985 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cfb3cc8-2636-4c2d-9837-8824cb902944-kube-api-access-s9k4v" (OuterVolumeSpecName: "kube-api-access-s9k4v") pod "8cfb3cc8-2636-4c2d-9837-8824cb902944" (UID: "8cfb3cc8-2636-4c2d-9837-8824cb902944"). InnerVolumeSpecName "kube-api-access-s9k4v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:25:58.381198 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:58.381176 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-util" (OuterVolumeSpecName: "util") pod "8cfb3cc8-2636-4c2d-9837-8824cb902944" (UID: "8cfb3cc8-2636-4c2d-9837-8824cb902944"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:25:58.477308 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:58.477213 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-util\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:25:58.477308 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:58.477249 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cfb3cc8-2636-4c2d-9837-8824cb902944-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:25:58.477308 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:58.477258 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s9k4v\" (UniqueName: \"kubernetes.io/projected/8cfb3cc8-2636-4c2d-9837-8824cb902944-kube-api-access-s9k4v\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:25:59.159385 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:59.159348 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" event={"ID":"8cfb3cc8-2636-4c2d-9837-8824cb902944","Type":"ContainerDied","Data":"272eaddf48f057877a2d0916f60ebf4b96fae090a18c48b15683d6ed8f8caa7f"} Apr 28 19:25:59.159385 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:59.159385 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="272eaddf48f057877a2d0916f60ebf4b96fae090a18c48b15683d6ed8f8caa7f" Apr 28 19:25:59.159588 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:25:59.159396 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c96ndv" Apr 28 19:26:02.344698 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.344661 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c"] Apr 28 19:26:02.345065 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.345029 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cfb3cc8-2636-4c2d-9837-8824cb902944" containerName="util" Apr 28 19:26:02.345065 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.345045 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfb3cc8-2636-4c2d-9837-8824cb902944" containerName="util" Apr 28 19:26:02.345065 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.345059 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cfb3cc8-2636-4c2d-9837-8824cb902944" containerName="pull" Apr 28 19:26:02.345065 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.345067 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfb3cc8-2636-4c2d-9837-8824cb902944" containerName="pull" Apr 28 19:26:02.345193 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.345078 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8cfb3cc8-2636-4c2d-9837-8824cb902944" containerName="extract" Apr 28 19:26:02.345193 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.345083 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfb3cc8-2636-4c2d-9837-8824cb902944" containerName="extract" Apr 28 19:26:02.345193 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.345127 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cfb3cc8-2636-4c2d-9837-8824cb902944" containerName="extract" Apr 28 19:26:02.349798 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.349782 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" Apr 28 19:26:02.352665 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.352642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 28 19:26:02.353338 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.353316 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 28 19:26:02.353446 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.353369 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-t7fmb\"" Apr 28 19:26:02.353446 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.353385 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 28 19:26:02.358889 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.358865 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c"] Apr 28 19:26:02.405885 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.405851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/92df653e-c495-455f-bd6a-99d3768a8f41-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c\" (UID: \"92df653e-c495-455f-bd6a-99d3768a8f41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" Apr 28 19:26:02.405885 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.405884 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4kkg\" (UniqueName: \"kubernetes.io/projected/92df653e-c495-455f-bd6a-99d3768a8f41-kube-api-access-g4kkg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c\" (UID: \"92df653e-c495-455f-bd6a-99d3768a8f41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" Apr 28 19:26:02.506671 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.506633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/92df653e-c495-455f-bd6a-99d3768a8f41-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c\" (UID: \"92df653e-c495-455f-bd6a-99d3768a8f41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" Apr 28 19:26:02.506671 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.506669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4kkg\" (UniqueName: \"kubernetes.io/projected/92df653e-c495-455f-bd6a-99d3768a8f41-kube-api-access-g4kkg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c\" (UID: \"92df653e-c495-455f-bd6a-99d3768a8f41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" Apr 28 19:26:02.509110 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.509085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/92df653e-c495-455f-bd6a-99d3768a8f41-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c\" (UID: 
\"92df653e-c495-455f-bd6a-99d3768a8f41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" Apr 28 19:26:02.521565 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.521534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4kkg\" (UniqueName: \"kubernetes.io/projected/92df653e-c495-455f-bd6a-99d3768a8f41-kube-api-access-g4kkg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c\" (UID: \"92df653e-c495-455f-bd6a-99d3768a8f41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" Apr 28 19:26:02.659697 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.659589 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" Apr 28 19:26:02.786149 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:02.786126 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c"] Apr 28 19:26:02.788510 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:26:02.788483 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92df653e_c495_455f_bd6a_99d3768a8f41.slice/crio-e22e27ef1349a4aa80f3e747068c219b2d78d66b7ac0f46adc32d7efaad112d1 WatchSource:0}: Error finding container e22e27ef1349a4aa80f3e747068c219b2d78d66b7ac0f46adc32d7efaad112d1: Status 404 returned error can't find the container with id e22e27ef1349a4aa80f3e747068c219b2d78d66b7ac0f46adc32d7efaad112d1 Apr 28 19:26:03.172295 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:03.172263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" event={"ID":"92df653e-c495-455f-bd6a-99d3768a8f41","Type":"ContainerStarted","Data":"e22e27ef1349a4aa80f3e747068c219b2d78d66b7ac0f46adc32d7efaad112d1"} Apr 28 19:26:06.432018 ip-10-0-139-184 kubenswrapper[2576]: 
I0428 19:26:06.431993 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65d79b9785-vfbq2"] Apr 28 19:26:06.436928 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.436895 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.445675 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.445649 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d79b9785-vfbq2"] Apr 28 19:26:06.541909 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.541872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-oauth-serving-cert\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.542064 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.541919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/074538fe-619d-4f26-a23e-bc0f85a1fc20-console-serving-cert\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.542064 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.541952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/074538fe-619d-4f26-a23e-bc0f85a1fc20-console-oauth-config\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.542064 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.541969 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-service-ca\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.542064 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.541993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-692pr\" (UniqueName: \"kubernetes.io/projected/074538fe-619d-4f26-a23e-bc0f85a1fc20-kube-api-access-692pr\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.542064 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.542049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-console-config\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.542234 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.542076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-trusted-ca-bundle\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.642833 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.642791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/074538fe-619d-4f26-a23e-bc0f85a1fc20-console-oauth-config\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.643102 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.642842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-service-ca\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.643102 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.642869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-692pr\" (UniqueName: \"kubernetes.io/projected/074538fe-619d-4f26-a23e-bc0f85a1fc20-kube-api-access-692pr\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.643102 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.642898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-console-config\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.643102 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.642927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-trusted-ca-bundle\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.643102 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.642985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-oauth-serving-cert\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " 
pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.643102 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.643016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/074538fe-619d-4f26-a23e-bc0f85a1fc20-console-serving-cert\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.643773 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.643745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-console-config\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.643878 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.643857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-trusted-ca-bundle\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.643931 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.643877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-service-ca\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.643997 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.643975 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/074538fe-619d-4f26-a23e-bc0f85a1fc20-oauth-serving-cert\") pod \"console-65d79b9785-vfbq2\" (UID: 
\"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.645740 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.645677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/074538fe-619d-4f26-a23e-bc0f85a1fc20-console-oauth-config\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.645913 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.645895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/074538fe-619d-4f26-a23e-bc0f85a1fc20-console-serving-cert\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.664467 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.664436 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-692pr\" (UniqueName: \"kubernetes.io/projected/074538fe-619d-4f26-a23e-bc0f85a1fc20-kube-api-access-692pr\") pod \"console-65d79b9785-vfbq2\" (UID: \"074538fe-619d-4f26-a23e-bc0f85a1fc20\") " pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.757579 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.757540 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:06.904372 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:06.904290 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d79b9785-vfbq2"] Apr 28 19:26:06.907400 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:26:06.907364 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod074538fe_619d_4f26_a23e_bc0f85a1fc20.slice/crio-040e4efb8c8a04f79ba6086a599dcde45b0b936189d916dc798192e73f9bab29 WatchSource:0}: Error finding container 040e4efb8c8a04f79ba6086a599dcde45b0b936189d916dc798192e73f9bab29: Status 404 returned error can't find the container with id 040e4efb8c8a04f79ba6086a599dcde45b0b936189d916dc798192e73f9bab29 Apr 28 19:26:07.188818 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.188717 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" event={"ID":"92df653e-c495-455f-bd6a-99d3768a8f41","Type":"ContainerStarted","Data":"c7e1f6ebf22a5266837f4fad3ff40c3b14df87a7b882460609cf2ac553d1903b"} Apr 28 19:26:07.188969 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.188891 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" Apr 28 19:26:07.190060 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.190037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d79b9785-vfbq2" event={"ID":"074538fe-619d-4f26-a23e-bc0f85a1fc20","Type":"ContainerStarted","Data":"139142409efc05740737575d9e5723130c94b7cf1795b927bca5b9808c11a382"} Apr 28 19:26:07.190060 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.190062 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d79b9785-vfbq2" 
event={"ID":"074538fe-619d-4f26-a23e-bc0f85a1fc20","Type":"ContainerStarted","Data":"040e4efb8c8a04f79ba6086a599dcde45b0b936189d916dc798192e73f9bab29"} Apr 28 19:26:07.215677 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.215634 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" podStartSLOduration=1.609675849 podStartE2EDuration="5.21562312s" podCreationTimestamp="2026-04-28 19:26:02 +0000 UTC" firstStartedPulling="2026-04-28 19:26:02.790197011 +0000 UTC m=+578.753097355" lastFinishedPulling="2026-04-28 19:26:06.396144266 +0000 UTC m=+582.359044626" observedRunningTime="2026-04-28 19:26:07.215328938 +0000 UTC m=+583.178229305" watchObservedRunningTime="2026-04-28 19:26:07.21562312 +0000 UTC m=+583.178523482" Apr 28 19:26:07.242354 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.242315 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65d79b9785-vfbq2" podStartSLOduration=1.242306142 podStartE2EDuration="1.242306142s" podCreationTimestamp="2026-04-28 19:26:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:26:07.240791658 +0000 UTC m=+583.203692036" watchObservedRunningTime="2026-04-28 19:26:07.242306142 +0000 UTC m=+583.205206507" Apr 28 19:26:07.430953 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.430921 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl"] Apr 28 19:26:07.434105 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.434090 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:07.439950 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.439900 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 28 19:26:07.440050 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.439907 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 28 19:26:07.440050 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.439907 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-gkwcz\"" Apr 28 19:26:07.457632 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.457587 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl"] Apr 28 19:26:07.551337 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.551305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f7fr\" (UniqueName: \"kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-kube-api-access-8f7fr\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:07.551520 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.551356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:07.551520 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.551442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:07.652398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.652359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f7fr\" (UniqueName: \"kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-kube-api-access-8f7fr\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:07.652579 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.652444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:07.652579 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.652513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:07.652579 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:07.652567 2576 secret.go:281] references non-existent secret key: tls.crt Apr 28 19:26:07.652781 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:07.652585 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 28 19:26:07.652781 ip-10-0-139-184 kubenswrapper[2576]: 
E0428 19:26:07.652626 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl: references non-existent secret key: tls.crt Apr 28 19:26:07.652781 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:07.652691 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates podName:0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14 nodeName:}" failed. No retries permitted until 2026-04-28 19:26:08.152670753 +0000 UTC m=+584.115571103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates") pod "keda-metrics-apiserver-7c9f485588-hjmdl" (UID: "0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14") : references non-existent secret key: tls.crt Apr 28 19:26:07.652988 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.652966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:07.660229 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.660202 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-pzfq2"] Apr 28 19:26:07.663662 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.663648 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-pzfq2" Apr 28 19:26:07.666918 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.666901 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 28 19:26:07.699943 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.699885 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-pzfq2"] Apr 28 19:26:07.703054 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.703028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f7fr\" (UniqueName: \"kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-kube-api-access-8f7fr\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:07.753238 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.753208 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f4bac08b-9ff7-42a7-9c68-6b5d198960f7-certificates\") pod \"keda-admission-cf49989db-pzfq2\" (UID: \"f4bac08b-9ff7-42a7-9c68-6b5d198960f7\") " pod="openshift-keda/keda-admission-cf49989db-pzfq2" Apr 28 19:26:07.753396 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.753368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqvhm\" (UniqueName: \"kubernetes.io/projected/f4bac08b-9ff7-42a7-9c68-6b5d198960f7-kube-api-access-kqvhm\") pod \"keda-admission-cf49989db-pzfq2\" (UID: \"f4bac08b-9ff7-42a7-9c68-6b5d198960f7\") " pod="openshift-keda/keda-admission-cf49989db-pzfq2" Apr 28 19:26:07.855155 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.855121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/f4bac08b-9ff7-42a7-9c68-6b5d198960f7-certificates\") pod \"keda-admission-cf49989db-pzfq2\" (UID: \"f4bac08b-9ff7-42a7-9c68-6b5d198960f7\") " pod="openshift-keda/keda-admission-cf49989db-pzfq2" Apr 28 19:26:07.855467 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.855448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqvhm\" (UniqueName: \"kubernetes.io/projected/f4bac08b-9ff7-42a7-9c68-6b5d198960f7-kube-api-access-kqvhm\") pod \"keda-admission-cf49989db-pzfq2\" (UID: \"f4bac08b-9ff7-42a7-9c68-6b5d198960f7\") " pod="openshift-keda/keda-admission-cf49989db-pzfq2" Apr 28 19:26:07.860978 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.859103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f4bac08b-9ff7-42a7-9c68-6b5d198960f7-certificates\") pod \"keda-admission-cf49989db-pzfq2\" (UID: \"f4bac08b-9ff7-42a7-9c68-6b5d198960f7\") " pod="openshift-keda/keda-admission-cf49989db-pzfq2" Apr 28 19:26:07.870535 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.870502 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqvhm\" (UniqueName: \"kubernetes.io/projected/f4bac08b-9ff7-42a7-9c68-6b5d198960f7-kube-api-access-kqvhm\") pod \"keda-admission-cf49989db-pzfq2\" (UID: \"f4bac08b-9ff7-42a7-9c68-6b5d198960f7\") " pod="openshift-keda/keda-admission-cf49989db-pzfq2" Apr 28 19:26:07.973185 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:07.973097 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-pzfq2" Apr 28 19:26:08.093079 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:08.093052 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-pzfq2"] Apr 28 19:26:08.095619 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:26:08.095575 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4bac08b_9ff7_42a7_9c68_6b5d198960f7.slice/crio-a596c2d1dcd3d77aa98677c7ba7b43b50fb6f1fc2dd7140ca15f67d408dafbde WatchSource:0}: Error finding container a596c2d1dcd3d77aa98677c7ba7b43b50fb6f1fc2dd7140ca15f67d408dafbde: Status 404 returned error can't find the container with id a596c2d1dcd3d77aa98677c7ba7b43b50fb6f1fc2dd7140ca15f67d408dafbde Apr 28 19:26:08.158284 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:08.158251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:08.158424 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:08.158409 2576 secret.go:281] references non-existent secret key: tls.crt Apr 28 19:26:08.158481 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:08.158433 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 28 19:26:08.158481 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:08.158455 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl: references non-existent secret key: tls.crt Apr 28 19:26:08.158544 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:08.158517 2576 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates podName:0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14 nodeName:}" failed. No retries permitted until 2026-04-28 19:26:09.158498688 +0000 UTC m=+585.121399036 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates") pod "keda-metrics-apiserver-7c9f485588-hjmdl" (UID: "0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14") : references non-existent secret key: tls.crt Apr 28 19:26:08.194405 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:08.194372 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-pzfq2" event={"ID":"f4bac08b-9ff7-42a7-9c68-6b5d198960f7","Type":"ContainerStarted","Data":"a596c2d1dcd3d77aa98677c7ba7b43b50fb6f1fc2dd7140ca15f67d408dafbde"} Apr 28 19:26:09.169415 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:09.169379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:09.169828 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:09.169550 2576 secret.go:281] references non-existent secret key: tls.crt Apr 28 19:26:09.169828 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:09.169570 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 28 19:26:09.169828 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:09.169590 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl: references non-existent secret key: tls.crt Apr 28 19:26:09.169828 ip-10-0-139-184 
kubenswrapper[2576]: E0428 19:26:09.169660 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates podName:0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14 nodeName:}" failed. No retries permitted until 2026-04-28 19:26:11.169646008 +0000 UTC m=+587.132546353 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates") pod "keda-metrics-apiserver-7c9f485588-hjmdl" (UID: "0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14") : references non-existent secret key: tls.crt Apr 28 19:26:10.202793 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:10.202754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-pzfq2" event={"ID":"f4bac08b-9ff7-42a7-9c68-6b5d198960f7","Type":"ContainerStarted","Data":"2db5cdbbb42a2d44d0d79d79d852f4cbf8917fc18598f6d1502de512960616e8"} Apr 28 19:26:10.203136 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:10.202814 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-pzfq2" Apr 28 19:26:10.227818 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:10.227766 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-pzfq2" podStartSLOduration=1.698463046 podStartE2EDuration="3.227752515s" podCreationTimestamp="2026-04-28 19:26:07 +0000 UTC" firstStartedPulling="2026-04-28 19:26:08.096746678 +0000 UTC m=+584.059647022" lastFinishedPulling="2026-04-28 19:26:09.626036147 +0000 UTC m=+585.588936491" observedRunningTime="2026-04-28 19:26:10.225798399 +0000 UTC m=+586.188698768" watchObservedRunningTime="2026-04-28 19:26:10.227752515 +0000 UTC m=+586.190652881" Apr 28 19:26:11.185494 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:11.185458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:11.188166 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:11.188144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hjmdl\" (UID: \"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:11.344617 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:11.344565 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:11.488882 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:11.487875 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl"] Apr 28 19:26:12.212456 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:12.212413 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" event={"ID":"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14","Type":"ContainerStarted","Data":"016de15bcc9d445921548f2a8fbe13432be33b871b6aa65bd5fa89d2a6272f0f"} Apr 28 19:26:15.223257 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:15.223221 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" event={"ID":"0e77644f-3f4e-4b9f-ac2d-a6ca7a93fe14","Type":"ContainerStarted","Data":"04d0de04249304e04fda20cfc82abe45f301e431a107077a7bf01904cb6c8074"} Apr 28 19:26:15.223731 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:15.223343 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:15.256040 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:15.255989 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" podStartSLOduration=5.024777211 podStartE2EDuration="8.255975138s" podCreationTimestamp="2026-04-28 19:26:07 +0000 UTC" firstStartedPulling="2026-04-28 19:26:11.487386324 +0000 UTC m=+587.450286672" lastFinishedPulling="2026-04-28 19:26:14.718584241 +0000 UTC m=+590.681484599" observedRunningTime="2026-04-28 19:26:15.255463736 +0000 UTC m=+591.218364106" watchObservedRunningTime="2026-04-28 19:26:15.255975138 +0000 UTC m=+591.218875545" Apr 28 19:26:16.758358 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:16.758318 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:16.758774 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:16.758371 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:16.763102 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:16.763079 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:17.233834 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:17.233757 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65d79b9785-vfbq2" Apr 28 19:26:17.295211 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:17.295173 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b4bf5bbd9-w6s82"] Apr 28 19:26:24.456033 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:24.456001 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" 
Apr 28 19:26:24.456693 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:24.456672 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:26:24.461655 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:24.461631 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:26:24.462429 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:24.462411 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:26:26.230362 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:26.230337 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hjmdl" Apr 28 19:26:28.196767 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:28.196739 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dvc5c" Apr 28 19:26:31.210668 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:31.210636 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-pzfq2" Apr 28 19:26:42.316144 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.316080 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b4bf5bbd9-w6s82" podUID="9f7863a5-951d-4d3a-b503-a031d7ccb87f" containerName="console" containerID="cri-o://e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af" gracePeriod=15 Apr 28 19:26:42.556362 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.556337 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7b4bf5bbd9-w6s82_9f7863a5-951d-4d3a-b503-a031d7ccb87f/console/0.log" Apr 28 19:26:42.556509 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.556407 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:26:42.751191 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.751163 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-serving-cert\") pod \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " Apr 28 19:26:42.751349 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.751201 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-oauth-config\") pod \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " Apr 28 19:26:42.751349 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.751245 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj87z\" (UniqueName: \"kubernetes.io/projected/9f7863a5-951d-4d3a-b503-a031d7ccb87f-kube-api-access-bj87z\") pod \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " Apr 28 19:26:42.751349 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.751288 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-oauth-serving-cert\") pod \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " Apr 28 19:26:42.751349 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.751317 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-service-ca\") pod \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " Apr 28 19:26:42.751500 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.751365 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-config\") pod \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " Apr 28 19:26:42.751500 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.751400 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-trusted-ca-bundle\") pod \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\" (UID: \"9f7863a5-951d-4d3a-b503-a031d7ccb87f\") " Apr 28 19:26:42.751860 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.751833 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-service-ca" (OuterVolumeSpecName: "service-ca") pod "9f7863a5-951d-4d3a-b503-a031d7ccb87f" (UID: "9f7863a5-951d-4d3a-b503-a031d7ccb87f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:26:42.751961 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.751827 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9f7863a5-951d-4d3a-b503-a031d7ccb87f" (UID: "9f7863a5-951d-4d3a-b503-a031d7ccb87f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:26:42.751961 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.751895 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-config" (OuterVolumeSpecName: "console-config") pod "9f7863a5-951d-4d3a-b503-a031d7ccb87f" (UID: "9f7863a5-951d-4d3a-b503-a031d7ccb87f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:26:42.752150 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.752128 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9f7863a5-951d-4d3a-b503-a031d7ccb87f" (UID: "9f7863a5-951d-4d3a-b503-a031d7ccb87f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:26:42.753640 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.753584 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9f7863a5-951d-4d3a-b503-a031d7ccb87f" (UID: "9f7863a5-951d-4d3a-b503-a031d7ccb87f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:26:42.753744 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.753691 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9f7863a5-951d-4d3a-b503-a031d7ccb87f" (UID: "9f7863a5-951d-4d3a-b503-a031d7ccb87f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:26:42.753790 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.753736 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f7863a5-951d-4d3a-b503-a031d7ccb87f-kube-api-access-bj87z" (OuterVolumeSpecName: "kube-api-access-bj87z") pod "9f7863a5-951d-4d3a-b503-a031d7ccb87f" (UID: "9f7863a5-951d-4d3a-b503-a031d7ccb87f"). InnerVolumeSpecName "kube-api-access-bj87z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:26:42.852916 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.852878 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bj87z\" (UniqueName: \"kubernetes.io/projected/9f7863a5-951d-4d3a-b503-a031d7ccb87f-kube-api-access-bj87z\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:26:42.852916 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.852908 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-oauth-serving-cert\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:26:42.852916 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.852920 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-service-ca\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:26:42.852916 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.852929 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:26:42.853169 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.852937 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9f7863a5-951d-4d3a-b503-a031d7ccb87f-trusted-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:26:42.853169 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.852946 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-serving-cert\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:26:42.853169 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:42.852956 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f7863a5-951d-4d3a-b503-a031d7ccb87f-console-oauth-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:26:43.314049 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:43.314013 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b4bf5bbd9-w6s82_9f7863a5-951d-4d3a-b503-a031d7ccb87f/console/0.log" Apr 28 19:26:43.314196 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:43.314055 2576 generic.go:358] "Generic (PLEG): container finished" podID="9f7863a5-951d-4d3a-b503-a031d7ccb87f" containerID="e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af" exitCode=2 Apr 28 19:26:43.314196 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:43.314121 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b4bf5bbd9-w6s82" Apr 28 19:26:43.314196 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:43.314148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4bf5bbd9-w6s82" event={"ID":"9f7863a5-951d-4d3a-b503-a031d7ccb87f","Type":"ContainerDied","Data":"e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af"} Apr 28 19:26:43.314196 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:43.314192 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4bf5bbd9-w6s82" event={"ID":"9f7863a5-951d-4d3a-b503-a031d7ccb87f","Type":"ContainerDied","Data":"47c64f69664b410ab70dde04f7cfc5eeb182ffe8ab7dc13e32188e81f6d1bf58"} Apr 28 19:26:43.314365 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:43.314210 2576 scope.go:117] "RemoveContainer" containerID="e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af" Apr 28 19:26:43.322725 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:43.322507 2576 scope.go:117] "RemoveContainer" containerID="e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af" Apr 28 19:26:43.322953 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:26:43.322822 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af\": container with ID starting with e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af not found: ID does not exist" containerID="e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af" Apr 28 19:26:43.322953 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:43.322848 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af"} err="failed to get container status \"e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af\": rpc error: code = 
NotFound desc = could not find container \"e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af\": container with ID starting with e9588f82f4e18013524cf8a75d6e3b26354f5c86dc594b2ca1a5c506121015af not found: ID does not exist" Apr 28 19:26:43.336141 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:43.336105 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b4bf5bbd9-w6s82"] Apr 28 19:26:43.338351 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:43.338322 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b4bf5bbd9-w6s82"] Apr 28 19:26:44.513826 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:26:44.513791 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f7863a5-951d-4d3a-b503-a031d7ccb87f" path="/var/lib/kubelet/pods/9f7863a5-951d-4d3a-b503-a031d7ccb87f/volumes" Apr 28 19:27:14.785440 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.785411 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-nctcv"] Apr 28 19:27:14.786023 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.785856 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f7863a5-951d-4d3a-b503-a031d7ccb87f" containerName="console" Apr 28 19:27:14.786023 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.785874 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7863a5-951d-4d3a-b503-a031d7ccb87f" containerName="console" Apr 28 19:27:14.786023 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.785942 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f7863a5-951d-4d3a-b503-a031d7ccb87f" containerName="console" Apr 28 19:27:14.787751 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.787730 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-nctcv" Apr 28 19:27:14.790570 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.790548 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 28 19:27:14.790680 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.790577 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 28 19:27:14.791791 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.791771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 28 19:27:14.791791 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.791786 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-z8xsc\"" Apr 28 19:27:14.798714 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.798695 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-nctcv"] Apr 28 19:27:14.894830 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.894795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/76b5d293-e0e7-41d3-afda-56b52042dc5b-data\") pod \"seaweedfs-86cc847c5c-nctcv\" (UID: \"76b5d293-e0e7-41d3-afda-56b52042dc5b\") " pod="kserve/seaweedfs-86cc847c5c-nctcv" Apr 28 19:27:14.894830 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.894830 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpz7m\" (UniqueName: \"kubernetes.io/projected/76b5d293-e0e7-41d3-afda-56b52042dc5b-kube-api-access-qpz7m\") pod \"seaweedfs-86cc847c5c-nctcv\" (UID: \"76b5d293-e0e7-41d3-afda-56b52042dc5b\") " pod="kserve/seaweedfs-86cc847c5c-nctcv" Apr 28 19:27:14.995440 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.995403 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/76b5d293-e0e7-41d3-afda-56b52042dc5b-data\") pod \"seaweedfs-86cc847c5c-nctcv\" (UID: \"76b5d293-e0e7-41d3-afda-56b52042dc5b\") " pod="kserve/seaweedfs-86cc847c5c-nctcv" Apr 28 19:27:14.995440 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.995442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpz7m\" (UniqueName: \"kubernetes.io/projected/76b5d293-e0e7-41d3-afda-56b52042dc5b-kube-api-access-qpz7m\") pod \"seaweedfs-86cc847c5c-nctcv\" (UID: \"76b5d293-e0e7-41d3-afda-56b52042dc5b\") " pod="kserve/seaweedfs-86cc847c5c-nctcv" Apr 28 19:27:14.995888 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:14.995862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/76b5d293-e0e7-41d3-afda-56b52042dc5b-data\") pod \"seaweedfs-86cc847c5c-nctcv\" (UID: \"76b5d293-e0e7-41d3-afda-56b52042dc5b\") " pod="kserve/seaweedfs-86cc847c5c-nctcv" Apr 28 19:27:15.003958 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:15.003927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpz7m\" (UniqueName: \"kubernetes.io/projected/76b5d293-e0e7-41d3-afda-56b52042dc5b-kube-api-access-qpz7m\") pod \"seaweedfs-86cc847c5c-nctcv\" (UID: \"76b5d293-e0e7-41d3-afda-56b52042dc5b\") " pod="kserve/seaweedfs-86cc847c5c-nctcv" Apr 28 19:27:15.097127 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:15.097051 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-nctcv" Apr 28 19:27:15.225773 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:15.225749 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-nctcv"] Apr 28 19:27:15.228542 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:27:15.228514 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76b5d293_e0e7_41d3_afda_56b52042dc5b.slice/crio-6f1ab0823db88f8fd151ada4944a9f23a7ea88e4437cf1323edbb368db0b0c32 WatchSource:0}: Error finding container 6f1ab0823db88f8fd151ada4944a9f23a7ea88e4437cf1323edbb368db0b0c32: Status 404 returned error can't find the container with id 6f1ab0823db88f8fd151ada4944a9f23a7ea88e4437cf1323edbb368db0b0c32 Apr 28 19:27:15.411695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:15.411586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-nctcv" event={"ID":"76b5d293-e0e7-41d3-afda-56b52042dc5b","Type":"ContainerStarted","Data":"6f1ab0823db88f8fd151ada4944a9f23a7ea88e4437cf1323edbb368db0b0c32"} Apr 28 19:27:18.427202 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:18.427163 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-nctcv" event={"ID":"76b5d293-e0e7-41d3-afda-56b52042dc5b","Type":"ContainerStarted","Data":"4938895626a89cf34d5e074f228a9092f6babad1749a2c62b132e14fb586a1cf"} Apr 28 19:27:18.427626 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:18.427335 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-nctcv" Apr 28 19:27:18.446061 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:18.446018 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-nctcv" podStartSLOduration=1.420766156 podStartE2EDuration="4.446006461s" podCreationTimestamp="2026-04-28 19:27:14 +0000 UTC" 
firstStartedPulling="2026-04-28 19:27:15.230201176 +0000 UTC m=+651.193101520" lastFinishedPulling="2026-04-28 19:27:18.255441466 +0000 UTC m=+654.218341825" observedRunningTime="2026-04-28 19:27:18.44395042 +0000 UTC m=+654.406850784" watchObservedRunningTime="2026-04-28 19:27:18.446006461 +0000 UTC m=+654.408906826" Apr 28 19:27:24.433049 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:27:24.433019 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-nctcv" Apr 28 19:28:25.769992 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.769907 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-r8x4c"] Apr 28 19:28:25.773003 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.772986 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:28:25.775840 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.775816 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-24fh7\"" Apr 28 19:28:25.775962 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.775817 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 28 19:28:25.783276 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.783251 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-r8x4c"] Apr 28 19:28:25.785327 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.785296 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-2sdrd"] Apr 28 19:28:25.788321 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.788300 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 19:28:25.791581 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.791558 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 28 19:28:25.791692 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.791581 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-2zzt6\"" Apr 28 19:28:25.799020 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.798999 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-2sdrd"] Apr 28 19:28:25.852914 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.852877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lk6k\" (UniqueName: \"kubernetes.io/projected/71203882-4a2d-4f7a-a355-6babe49bc167-kube-api-access-5lk6k\") pod \"model-serving-api-86f7b4b499-r8x4c\" (UID: \"71203882-4a2d-4f7a-a355-6babe49bc167\") " pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:28:25.852914 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.852921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71203882-4a2d-4f7a-a355-6babe49bc167-tls-certs\") pod \"model-serving-api-86f7b4b499-r8x4c\" (UID: \"71203882-4a2d-4f7a-a355-6babe49bc167\") " pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:28:25.853154 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.852987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e7a1512-ee6c-4bac-8c74-191587ce85b3-cert\") pod \"odh-model-controller-696fc77849-2sdrd\" (UID: \"0e7a1512-ee6c-4bac-8c74-191587ce85b3\") " pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 
19:28:25.853154 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.853035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrwpk\" (UniqueName: \"kubernetes.io/projected/0e7a1512-ee6c-4bac-8c74-191587ce85b3-kube-api-access-mrwpk\") pod \"odh-model-controller-696fc77849-2sdrd\" (UID: \"0e7a1512-ee6c-4bac-8c74-191587ce85b3\") " pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 19:28:25.954012 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.953957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lk6k\" (UniqueName: \"kubernetes.io/projected/71203882-4a2d-4f7a-a355-6babe49bc167-kube-api-access-5lk6k\") pod \"model-serving-api-86f7b4b499-r8x4c\" (UID: \"71203882-4a2d-4f7a-a355-6babe49bc167\") " pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:28:25.954012 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.954022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71203882-4a2d-4f7a-a355-6babe49bc167-tls-certs\") pod \"model-serving-api-86f7b4b499-r8x4c\" (UID: \"71203882-4a2d-4f7a-a355-6babe49bc167\") " pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:28:25.954258 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.954054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e7a1512-ee6c-4bac-8c74-191587ce85b3-cert\") pod \"odh-model-controller-696fc77849-2sdrd\" (UID: \"0e7a1512-ee6c-4bac-8c74-191587ce85b3\") " pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 19:28:25.954258 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.954076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrwpk\" (UniqueName: \"kubernetes.io/projected/0e7a1512-ee6c-4bac-8c74-191587ce85b3-kube-api-access-mrwpk\") pod 
\"odh-model-controller-696fc77849-2sdrd\" (UID: \"0e7a1512-ee6c-4bac-8c74-191587ce85b3\") " pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 19:28:25.954258 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:28:25.954196 2576 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 28 19:28:25.954400 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:28:25.954265 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71203882-4a2d-4f7a-a355-6babe49bc167-tls-certs podName:71203882-4a2d-4f7a-a355-6babe49bc167 nodeName:}" failed. No retries permitted until 2026-04-28 19:28:26.454244686 +0000 UTC m=+722.417145044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/71203882-4a2d-4f7a-a355-6babe49bc167-tls-certs") pod "model-serving-api-86f7b4b499-r8x4c" (UID: "71203882-4a2d-4f7a-a355-6babe49bc167") : secret "model-serving-api-tls" not found Apr 28 19:28:25.954400 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:28:25.954198 2576 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 28 19:28:25.954400 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:28:25.954369 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e7a1512-ee6c-4bac-8c74-191587ce85b3-cert podName:0e7a1512-ee6c-4bac-8c74-191587ce85b3 nodeName:}" failed. No retries permitted until 2026-04-28 19:28:26.454350371 +0000 UTC m=+722.417250716 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e7a1512-ee6c-4bac-8c74-191587ce85b3-cert") pod "odh-model-controller-696fc77849-2sdrd" (UID: "0e7a1512-ee6c-4bac-8c74-191587ce85b3") : secret "odh-model-controller-webhook-cert" not found Apr 28 19:28:25.967425 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.967388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrwpk\" (UniqueName: \"kubernetes.io/projected/0e7a1512-ee6c-4bac-8c74-191587ce85b3-kube-api-access-mrwpk\") pod \"odh-model-controller-696fc77849-2sdrd\" (UID: \"0e7a1512-ee6c-4bac-8c74-191587ce85b3\") " pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 19:28:25.967587 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:25.967480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lk6k\" (UniqueName: \"kubernetes.io/projected/71203882-4a2d-4f7a-a355-6babe49bc167-kube-api-access-5lk6k\") pod \"model-serving-api-86f7b4b499-r8x4c\" (UID: \"71203882-4a2d-4f7a-a355-6babe49bc167\") " pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:28:26.458925 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:26.458872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71203882-4a2d-4f7a-a355-6babe49bc167-tls-certs\") pod \"model-serving-api-86f7b4b499-r8x4c\" (UID: \"71203882-4a2d-4f7a-a355-6babe49bc167\") " pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:28:26.458925 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:26.458936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e7a1512-ee6c-4bac-8c74-191587ce85b3-cert\") pod \"odh-model-controller-696fc77849-2sdrd\" (UID: \"0e7a1512-ee6c-4bac-8c74-191587ce85b3\") " pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 19:28:26.461518 ip-10-0-139-184 kubenswrapper[2576]: I0428 
19:28:26.461471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e7a1512-ee6c-4bac-8c74-191587ce85b3-cert\") pod \"odh-model-controller-696fc77849-2sdrd\" (UID: \"0e7a1512-ee6c-4bac-8c74-191587ce85b3\") " pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 19:28:26.461683 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:26.461565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71203882-4a2d-4f7a-a355-6babe49bc167-tls-certs\") pod \"model-serving-api-86f7b4b499-r8x4c\" (UID: \"71203882-4a2d-4f7a-a355-6babe49bc167\") " pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:28:26.683516 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:26.683484 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:28:26.698173 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:26.698145 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 19:28:26.814733 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:26.814701 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-r8x4c"] Apr 28 19:28:26.816581 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:28:26.816553 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71203882_4a2d_4f7a_a355_6babe49bc167.slice/crio-3896afe75e613cebb19efbc9c5e6c1032829de9a3da01640a7a13e875c81fa79 WatchSource:0}: Error finding container 3896afe75e613cebb19efbc9c5e6c1032829de9a3da01640a7a13e875c81fa79: Status 404 returned error can't find the container with id 3896afe75e613cebb19efbc9c5e6c1032829de9a3da01640a7a13e875c81fa79 Apr 28 19:28:26.837300 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:26.837280 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-2sdrd"] Apr 28 19:28:26.839400 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:28:26.839373 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e7a1512_ee6c_4bac_8c74_191587ce85b3.slice/crio-272407852c712245804ac1a621ee691eb247b1a060f1be7226e90d042bee04fe WatchSource:0}: Error finding container 272407852c712245804ac1a621ee691eb247b1a060f1be7226e90d042bee04fe: Status 404 returned error can't find the container with id 272407852c712245804ac1a621ee691eb247b1a060f1be7226e90d042bee04fe Apr 28 19:28:27.648853 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:27.648788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-r8x4c" event={"ID":"71203882-4a2d-4f7a-a355-6babe49bc167","Type":"ContainerStarted","Data":"3896afe75e613cebb19efbc9c5e6c1032829de9a3da01640a7a13e875c81fa79"} Apr 28 19:28:27.650424 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:27.650376 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-2sdrd" event={"ID":"0e7a1512-ee6c-4bac-8c74-191587ce85b3","Type":"ContainerStarted","Data":"272407852c712245804ac1a621ee691eb247b1a060f1be7226e90d042bee04fe"} Apr 28 19:28:30.663075 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:30.663031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-r8x4c" event={"ID":"71203882-4a2d-4f7a-a355-6babe49bc167","Type":"ContainerStarted","Data":"c8d857b354f476305c2d503847f9a29673c458a4f25b34b6de77ebfd5a3a2765"} Apr 28 19:28:30.663514 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:30.663382 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:28:30.664594 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:30.664561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-2sdrd" event={"ID":"0e7a1512-ee6c-4bac-8c74-191587ce85b3","Type":"ContainerStarted","Data":"d8c29b1a2f4384e8e061ee641bb316aa5ec32246c2eee1281367abee2a262bbd"} Apr 28 19:28:30.664866 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:30.664851 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 19:28:30.680434 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:30.680372 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-r8x4c" podStartSLOduration=1.9281234600000001 podStartE2EDuration="5.68035873s" podCreationTimestamp="2026-04-28 19:28:25 +0000 UTC" firstStartedPulling="2026-04-28 19:28:26.818381108 +0000 UTC m=+722.781281451" lastFinishedPulling="2026-04-28 19:28:30.570616366 +0000 UTC m=+726.533516721" observedRunningTime="2026-04-28 19:28:30.678922302 +0000 UTC m=+726.641822668" watchObservedRunningTime="2026-04-28 19:28:30.68035873 +0000 UTC m=+726.643259095" 
Apr 28 19:28:30.694196 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:30.694152 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-2sdrd" podStartSLOduration=1.9609553709999998 podStartE2EDuration="5.694136416s" podCreationTimestamp="2026-04-28 19:28:25 +0000 UTC" firstStartedPulling="2026-04-28 19:28:26.842393578 +0000 UTC m=+722.805293921" lastFinishedPulling="2026-04-28 19:28:30.575574619 +0000 UTC m=+726.538474966" observedRunningTime="2026-04-28 19:28:30.69318347 +0000 UTC m=+726.656083836" watchObservedRunningTime="2026-04-28 19:28:30.694136416 +0000 UTC m=+726.657036783" Apr 28 19:28:41.670881 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:41.670849 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-2sdrd" Apr 28 19:28:41.672752 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:28:41.672737 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-r8x4c" Apr 28 19:29:01.969134 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:01.969102 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"] Apr 28 19:29:01.972466 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:01.972451 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:29:01.975023 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:01.974999 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e1324-predictor-serving-cert\"" Apr 28 19:29:01.975142 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:01.975026 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 28 19:29:01.976048 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:01.976029 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e1324-kube-rbac-proxy-sar-config\"" Apr 28 19:29:01.976048 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:01.976046 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:29:01.976242 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:01.976047 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-fkxbx\"" Apr 28 19:29:01.980868 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:01.980847 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"] Apr 28 19:29:02.063854 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.063827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmvv\" (UniqueName: \"kubernetes.io/projected/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-kube-api-access-vhmvv\") pod \"success-200-isvc-e1324-predictor-7798b8b7c8-cktzl\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:29:02.064005 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.063862 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-proxy-tls\") pod \"success-200-isvc-e1324-predictor-7798b8b7c8-cktzl\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:29:02.064005 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.063881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-e1324-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-success-200-isvc-e1324-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e1324-predictor-7798b8b7c8-cktzl\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:29:02.165189 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.165156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmvv\" (UniqueName: \"kubernetes.io/projected/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-kube-api-access-vhmvv\") pod \"success-200-isvc-e1324-predictor-7798b8b7c8-cktzl\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:29:02.165336 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.165197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-proxy-tls\") pod \"success-200-isvc-e1324-predictor-7798b8b7c8-cktzl\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:29:02.165336 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.165226 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"success-200-isvc-e1324-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-success-200-isvc-e1324-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e1324-predictor-7798b8b7c8-cktzl\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:29:02.165960 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.165940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-e1324-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-success-200-isvc-e1324-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e1324-predictor-7798b8b7c8-cktzl\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:29:02.167969 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.167946 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-proxy-tls\") pod \"success-200-isvc-e1324-predictor-7798b8b7c8-cktzl\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:29:02.173620 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.173567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmvv\" (UniqueName: \"kubernetes.io/projected/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-kube-api-access-vhmvv\") pod \"success-200-isvc-e1324-predictor-7798b8b7c8-cktzl\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:29:02.283346 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.283318 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"
Apr 28 19:29:02.384270 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.384235 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"]
Apr 28 19:29:02.389975 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.389949 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.392631 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.392586 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\""
Apr 28 19:29:02.392907 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.392888 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\""
Apr 28 19:29:02.398540 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.398515 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"]
Apr 28 19:29:02.416518 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.416498 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"]
Apr 28 19:29:02.418270 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:29:02.418242 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba0d6624_ce60_4b4c_ab0d_c60cddbbddcf.slice/crio-bf31d786fe2a3f1d6fdcead2016085a623c195872fb1778f2e384fddda142752 WatchSource:0}: Error finding container bf31d786fe2a3f1d6fdcead2016085a623c195872fb1778f2e384fddda142752: Status 404 returned error can't find the container with id bf31d786fe2a3f1d6fdcead2016085a623c195872fb1778f2e384fddda142752
Apr 28 19:29:02.468321 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.468295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6ncm\" (UniqueName: \"kubernetes.io/projected/ee494290-a65e-414a-9378-97f2c8034eaa-kube-api-access-s6ncm\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.468430 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.468376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee494290-a65e-414a-9378-97f2c8034eaa-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.468430 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.468420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee494290-a65e-414a-9378-97f2c8034eaa-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.468550 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.468440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ee494290-a65e-414a-9378-97f2c8034eaa-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.569425 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.569349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6ncm\" (UniqueName: \"kubernetes.io/projected/ee494290-a65e-414a-9378-97f2c8034eaa-kube-api-access-s6ncm\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.569425 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.569417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee494290-a65e-414a-9378-97f2c8034eaa-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.569643 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.569461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee494290-a65e-414a-9378-97f2c8034eaa-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.569643 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.569488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ee494290-a65e-414a-9378-97f2c8034eaa-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.569965 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.569941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee494290-a65e-414a-9378-97f2c8034eaa-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.570172 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.570152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ee494290-a65e-414a-9378-97f2c8034eaa-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.572495 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.572473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee494290-a65e-414a-9378-97f2c8034eaa-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.577851 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.577828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6ncm\" (UniqueName: \"kubernetes.io/projected/ee494290-a65e-414a-9378-97f2c8034eaa-kube-api-access-s6ncm\") pod \"isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.703422 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.703380 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:02.766896 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.766838 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" event={"ID":"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf","Type":"ContainerStarted","Data":"bf31d786fe2a3f1d6fdcead2016085a623c195872fb1778f2e384fddda142752"}
Apr 28 19:29:02.832877 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:02.832833 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"]
Apr 28 19:29:02.834904 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:29:02.834879 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee494290_a65e_414a_9378_97f2c8034eaa.slice/crio-fcfbe7a93c39e2b4f0179bbd63bbc38c313708b4c0f94818a59ff34fc3d4ccc4 WatchSource:0}: Error finding container fcfbe7a93c39e2b4f0179bbd63bbc38c313708b4c0f94818a59ff34fc3d4ccc4: Status 404 returned error can't find the container with id fcfbe7a93c39e2b4f0179bbd63bbc38c313708b4c0f94818a59ff34fc3d4ccc4
Apr 28 19:29:03.778568 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:03.778498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" event={"ID":"ee494290-a65e-414a-9378-97f2c8034eaa","Type":"ContainerStarted","Data":"fcfbe7a93c39e2b4f0179bbd63bbc38c313708b4c0f94818a59ff34fc3d4ccc4"}
Apr 28 19:29:15.825890 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:15.825851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" event={"ID":"ee494290-a65e-414a-9378-97f2c8034eaa","Type":"ContainerStarted","Data":"ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a"}
Apr 28 19:29:16.832248 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:16.832205 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" event={"ID":"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf","Type":"ContainerStarted","Data":"f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308"}
Apr 28 19:29:18.839704 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:18.839664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" event={"ID":"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf","Type":"ContainerStarted","Data":"680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849"}
Apr 28 19:29:18.840079 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:18.839826 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"
Apr 28 19:29:18.859456 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:18.859413 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" podStartSLOduration=1.8121688059999999 podStartE2EDuration="17.859400292s" podCreationTimestamp="2026-04-28 19:29:01 +0000 UTC" firstStartedPulling="2026-04-28 19:29:02.420089756 +0000 UTC m=+758.382990100" lastFinishedPulling="2026-04-28 19:29:18.467321234 +0000 UTC m=+774.430221586" observedRunningTime="2026-04-28 19:29:18.857233504 +0000 UTC m=+774.820133884" watchObservedRunningTime="2026-04-28 19:29:18.859400292 +0000 UTC m=+774.822300658"
Apr 28 19:29:19.843789 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:19.843752 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee494290-a65e-414a-9378-97f2c8034eaa" containerID="ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a" exitCode=0
Apr 28 19:29:19.844192 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:19.843830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" event={"ID":"ee494290-a65e-414a-9378-97f2c8034eaa","Type":"ContainerDied","Data":"ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a"}
Apr 28 19:29:19.844192 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:19.844170 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"
Apr 28 19:29:19.845455 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:19.845427 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:29:20.848619 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:20.848562 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:29:25.853756 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:25.853723 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"
Apr 28 19:29:25.854335 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:25.854307 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:29:27.874123 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:27.874088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" event={"ID":"ee494290-a65e-414a-9378-97f2c8034eaa","Type":"ContainerStarted","Data":"a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b"}
Apr 28 19:29:27.874123 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:27.874122 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" event={"ID":"ee494290-a65e-414a-9378-97f2c8034eaa","Type":"ContainerStarted","Data":"520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac"}
Apr 28 19:29:27.874672 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:27.874406 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:27.874672 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:27.874543 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:27.875861 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:27.875837 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 28 19:29:27.895045 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:27.894991 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podStartSLOduration=1.415857552 podStartE2EDuration="25.894977049s" podCreationTimestamp="2026-04-28 19:29:02 +0000 UTC" firstStartedPulling="2026-04-28 19:29:02.837199327 +0000 UTC m=+758.800099671" lastFinishedPulling="2026-04-28 19:29:27.316318823 +0000 UTC m=+783.279219168" observedRunningTime="2026-04-28 19:29:27.893373415 +0000 UTC m=+783.856273803" watchObservedRunningTime="2026-04-28 19:29:27.894977049 +0000 UTC m=+783.857877416"
Apr 28 19:29:28.878028 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:28.877990 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 28 19:29:33.882591 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:33.882552 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"
Apr 28 19:29:33.883197 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:33.883171 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 28 19:29:35.854267 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:35.854225 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:29:43.883514 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:43.883429 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 28 19:29:45.854781 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:45.854734 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:29:53.884738 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:53.884695 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 28 19:29:55.855144 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:29:55.855099 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 28 19:30:03.883801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:03.883757 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 28 19:30:05.855725 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:05.855697 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"
Apr 28 19:30:13.883513 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:13.883470 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 28 19:30:22.060740 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.060707 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"]
Apr 28 19:30:22.064137 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.064121 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:22.066526 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.066495 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-e1324-serving-cert\""
Apr 28 19:30:22.066526 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.066495 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-e1324-kube-rbac-proxy-sar-config\""
Apr 28 19:30:22.070477 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.070453 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"]
Apr 28 19:30:22.176396 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.176363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-openshift-service-ca-bundle\") pod \"switch-graph-e1324-b8c5ffc4-w28k7\" (UID: \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\") " pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:22.176578 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.176424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls\") pod \"switch-graph-e1324-b8c5ffc4-w28k7\" (UID: \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\") " pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:22.277753 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.277716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-openshift-service-ca-bundle\") pod \"switch-graph-e1324-b8c5ffc4-w28k7\" (UID: \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\") " pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:22.277924 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.277793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls\") pod \"switch-graph-e1324-b8c5ffc4-w28k7\" (UID: \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\") " pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:22.277924 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:22.277917 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-e1324-serving-cert: secret "switch-graph-e1324-serving-cert" not found
Apr 28 19:30:22.277998 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:22.277972 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls podName:c35727eb-b64c-43fb-814e-4c7bf7a7bd01 nodeName:}" failed. No retries permitted until 2026-04-28 19:30:22.777955561 +0000 UTC m=+838.740855904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls") pod "switch-graph-e1324-b8c5ffc4-w28k7" (UID: "c35727eb-b64c-43fb-814e-4c7bf7a7bd01") : secret "switch-graph-e1324-serving-cert" not found
Apr 28 19:30:22.278359 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.278336 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-openshift-service-ca-bundle\") pod \"switch-graph-e1324-b8c5ffc4-w28k7\" (UID: \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\") " pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:22.782262 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.782220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls\") pod \"switch-graph-e1324-b8c5ffc4-w28k7\" (UID: \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\") " pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:22.784874 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.784852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls\") pod \"switch-graph-e1324-b8c5ffc4-w28k7\" (UID: \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\") " pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:22.975777 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:22.975735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:23.105536 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:23.105512 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"]
Apr 28 19:30:23.107705 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:30:23.107677 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc35727eb_b64c_43fb_814e_4c7bf7a7bd01.slice/crio-9c6ea0c1fd284396a3e10af9ff2bcdb6e0e1243c1d200a5dec597cb9161b2d49 WatchSource:0}: Error finding container 9c6ea0c1fd284396a3e10af9ff2bcdb6e0e1243c1d200a5dec597cb9161b2d49: Status 404 returned error can't find the container with id 9c6ea0c1fd284396a3e10af9ff2bcdb6e0e1243c1d200a5dec597cb9161b2d49
Apr 28 19:30:23.109485 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:23.109469 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:30:23.883819 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:23.883777 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 28 19:30:24.063996 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:24.063952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" event={"ID":"c35727eb-b64c-43fb-814e-4c7bf7a7bd01","Type":"ContainerStarted","Data":"9c6ea0c1fd284396a3e10af9ff2bcdb6e0e1243c1d200a5dec597cb9161b2d49"}
Apr 28 19:30:26.071361 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:26.071268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" event={"ID":"c35727eb-b64c-43fb-814e-4c7bf7a7bd01","Type":"ContainerStarted","Data":"1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac"}
Apr 28 19:30:26.071766 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:26.071437 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:26.088250 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:26.088201 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" podStartSLOduration=1.421293263 podStartE2EDuration="4.088187577s" podCreationTimestamp="2026-04-28 19:30:22 +0000 UTC" firstStartedPulling="2026-04-28 19:30:23.109594759 +0000 UTC m=+839.072495103" lastFinishedPulling="2026-04-28 19:30:25.776489073 +0000 UTC m=+841.739389417" observedRunningTime="2026-04-28 19:30:26.086552769 +0000 UTC m=+842.049453148" watchObservedRunningTime="2026-04-28 19:30:26.088187577 +0000 UTC m=+842.051087942"
Apr 28 19:30:32.080900 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.080867 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"
Apr 28 19:30:32.263731 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:32.263696 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-e1324-serving-cert: secret "switch-graph-e1324-serving-cert" not found
Apr 28 19:30:32.263904 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:32.263796 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls podName:c35727eb-b64c-43fb-814e-4c7bf7a7bd01 nodeName:}" failed. No retries permitted until 2026-04-28 19:30:32.763773532 +0000 UTC m=+848.726673881 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls") pod "switch-graph-e1324-b8c5ffc4-w28k7" (UID: "c35727eb-b64c-43fb-814e-4c7bf7a7bd01") : secret "switch-graph-e1324-serving-cert" not found
Apr 28 19:30:32.270941 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.270910 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"]
Apr 28 19:30:32.271189 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.271165 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerName="switch-graph-e1324" containerID="cri-o://1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac" gracePeriod=30
Apr 28 19:30:32.405495 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.405406 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"]
Apr 28 19:30:32.405770 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.405730 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kserve-container" containerID="cri-o://f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308" gracePeriod=30
Apr 28 19:30:32.405849 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.405779 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kube-rbac-proxy" containerID="cri-o://680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849" gracePeriod=30
Apr 28 19:30:32.557141 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.557102 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"]
Apr 28 19:30:32.560540 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.560515 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:32.563494 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.563476 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-0600b-predictor-serving-cert\""
Apr 28 19:30:32.563494 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.563485 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-0600b-kube-rbac-proxy-sar-config\""
Apr 28 19:30:32.583927 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.583901 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"]
Apr 28 19:30:32.667381 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.667296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d374c9-c909-4ef7-b723-e234c4404579-proxy-tls\") pod \"success-200-isvc-0600b-predictor-655f9fcfd4-98d2n\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:32.667530 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.667428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-0600b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26d374c9-c909-4ef7-b723-e234c4404579-success-200-isvc-0600b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0600b-predictor-655f9fcfd4-98d2n\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:32.667530 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.667482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v4wp\" (UniqueName: \"kubernetes.io/projected/26d374c9-c909-4ef7-b723-e234c4404579-kube-api-access-7v4wp\") pod \"success-200-isvc-0600b-predictor-655f9fcfd4-98d2n\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:32.768196 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.768157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-0600b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26d374c9-c909-4ef7-b723-e234c4404579-success-200-isvc-0600b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0600b-predictor-655f9fcfd4-98d2n\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:32.768374 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.768233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7v4wp\" (UniqueName: \"kubernetes.io/projected/26d374c9-c909-4ef7-b723-e234c4404579-kube-api-access-7v4wp\") pod \"success-200-isvc-0600b-predictor-655f9fcfd4-98d2n\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:32.768374 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.768270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d374c9-c909-4ef7-b723-e234c4404579-proxy-tls\") pod \"success-200-isvc-0600b-predictor-655f9fcfd4-98d2n\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:32.768374 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:32.768326 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-e1324-serving-cert: secret "switch-graph-e1324-serving-cert" not found
Apr 28 19:30:32.768536 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:32.768386 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-0600b-predictor-serving-cert: secret "success-200-isvc-0600b-predictor-serving-cert" not found
Apr 28 19:30:32.768536 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:32.768406 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls podName:c35727eb-b64c-43fb-814e-4c7bf7a7bd01 nodeName:}" failed. No retries permitted until 2026-04-28 19:30:33.768385852 +0000 UTC m=+849.731286198 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls") pod "switch-graph-e1324-b8c5ffc4-w28k7" (UID: "c35727eb-b64c-43fb-814e-4c7bf7a7bd01") : secret "switch-graph-e1324-serving-cert" not found
Apr 28 19:30:32.768536 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:32.768449 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d374c9-c909-4ef7-b723-e234c4404579-proxy-tls podName:26d374c9-c909-4ef7-b723-e234c4404579 nodeName:}" failed. No retries permitted until 2026-04-28 19:30:33.26843244 +0000 UTC m=+849.231332785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/26d374c9-c909-4ef7-b723-e234c4404579-proxy-tls") pod "success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" (UID: "26d374c9-c909-4ef7-b723-e234c4404579") : secret "success-200-isvc-0600b-predictor-serving-cert" not found
Apr 28 19:30:32.768953 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.768934 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-0600b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26d374c9-c909-4ef7-b723-e234c4404579-success-200-isvc-0600b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0600b-predictor-655f9fcfd4-98d2n\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:32.776648 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:32.776620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v4wp\" (UniqueName: \"kubernetes.io/projected/26d374c9-c909-4ef7-b723-e234c4404579-kube-api-access-7v4wp\") pod \"success-200-isvc-0600b-predictor-655f9fcfd4-98d2n\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:33.096220 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:33.096178 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerID="680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849" exitCode=2
Apr 28 19:30:33.096571 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:33.096227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" event={"ID":"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf","Type":"ContainerDied","Data":"680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849"}
Apr 28 19:30:33.274265 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:33.274233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d374c9-c909-4ef7-b723-e234c4404579-proxy-tls\") pod \"success-200-isvc-0600b-predictor-655f9fcfd4-98d2n\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:33.276966 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:33.276939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d374c9-c909-4ef7-b723-e234c4404579-proxy-tls\") pod \"success-200-isvc-0600b-predictor-655f9fcfd4-98d2n\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:33.471204 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:33.471120 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"
Apr 28 19:30:33.598131 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:33.596547 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"]
Apr 28 19:30:33.779445 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:33.779415 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-e1324-serving-cert: secret "switch-graph-e1324-serving-cert" not found
Apr 28 19:30:33.779593 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:33.779487 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls podName:c35727eb-b64c-43fb-814e-4c7bf7a7bd01 nodeName:}" failed. No retries permitted until 2026-04-28 19:30:35.779469862 +0000 UTC m=+851.742370212 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls") pod "switch-graph-e1324-b8c5ffc4-w28k7" (UID: "c35727eb-b64c-43fb-814e-4c7bf7a7bd01") : secret "switch-graph-e1324-serving-cert" not found Apr 28 19:30:33.883709 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:33.883682 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" Apr 28 19:30:34.101973 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:34.101888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" event={"ID":"26d374c9-c909-4ef7-b723-e234c4404579","Type":"ContainerStarted","Data":"77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182"} Apr 28 19:30:34.101973 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:34.101930 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" event={"ID":"26d374c9-c909-4ef7-b723-e234c4404579","Type":"ContainerStarted","Data":"6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0"} Apr 28 19:30:34.101973 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:34.101946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" event={"ID":"26d374c9-c909-4ef7-b723-e234c4404579","Type":"ContainerStarted","Data":"7279d902e88a4cdaffb559ab468e03dd5915e81043b9eb35f6ed37931131ce7e"} Apr 28 19:30:34.102399 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:34.102081 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" Apr 28 19:30:34.102399 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:34.102111 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" Apr 28 19:30:34.103368 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:34.103343 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 28 19:30:34.119860 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:34.119814 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podStartSLOduration=2.119802704 podStartE2EDuration="2.119802704s" podCreationTimestamp="2026-04-28 19:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:30:34.118633729 +0000 UTC m=+850.081534097" watchObservedRunningTime="2026-04-28 19:30:34.119802704 +0000 UTC m=+850.082703070" Apr 28 19:30:35.105899 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.105863 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 28 19:30:35.540151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.540121 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:30:35.595779 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.595748 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhmvv\" (UniqueName: \"kubernetes.io/projected/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-kube-api-access-vhmvv\") pod \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " Apr 28 19:30:35.595947 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.595801 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-e1324-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-success-200-isvc-e1324-kube-rbac-proxy-sar-config\") pod \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " Apr 28 19:30:35.595947 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.595843 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-proxy-tls\") pod \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\" (UID: \"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf\") " Apr 28 19:30:35.596204 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.596180 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-success-200-isvc-e1324-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-e1324-kube-rbac-proxy-sar-config") pod "ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" (UID: "ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf"). InnerVolumeSpecName "success-200-isvc-e1324-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:30:35.598092 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.598070 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" (UID: "ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:30:35.598199 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.598185 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-kube-api-access-vhmvv" (OuterVolumeSpecName: "kube-api-access-vhmvv") pod "ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" (UID: "ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf"). InnerVolumeSpecName "kube-api-access-vhmvv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:30:35.696439 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.696370 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:30:35.696439 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.696414 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vhmvv\" (UniqueName: \"kubernetes.io/projected/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-kube-api-access-vhmvv\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:30:35.696439 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:35.696425 2576 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-e1324-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf-success-200-isvc-e1324-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 
19:30:35.797113 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:35.797081 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-e1324-serving-cert: secret "switch-graph-e1324-serving-cert" not found Apr 28 19:30:35.797245 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:35.797151 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls podName:c35727eb-b64c-43fb-814e-4c7bf7a7bd01 nodeName:}" failed. No retries permitted until 2026-04-28 19:30:39.797134724 +0000 UTC m=+855.760035071 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls") pod "switch-graph-e1324-b8c5ffc4-w28k7" (UID: "c35727eb-b64c-43fb-814e-4c7bf7a7bd01") : secret "switch-graph-e1324-serving-cert" not found Apr 28 19:30:36.109824 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.109787 2576 generic.go:358] "Generic (PLEG): container finished" podID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerID="f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308" exitCode=0 Apr 28 19:30:36.110225 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.109856 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" event={"ID":"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf","Type":"ContainerDied","Data":"f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308"} Apr 28 19:30:36.110225 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.109873 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" Apr 28 19:30:36.110225 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.109891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl" event={"ID":"ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf","Type":"ContainerDied","Data":"bf31d786fe2a3f1d6fdcead2016085a623c195872fb1778f2e384fddda142752"} Apr 28 19:30:36.110225 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.109911 2576 scope.go:117] "RemoveContainer" containerID="680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849" Apr 28 19:30:36.118643 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.118624 2576 scope.go:117] "RemoveContainer" containerID="f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308" Apr 28 19:30:36.125875 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.125854 2576 scope.go:117] "RemoveContainer" containerID="680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849" Apr 28 19:30:36.126149 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:36.126126 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849\": container with ID starting with 680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849 not found: ID does not exist" containerID="680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849" Apr 28 19:30:36.126230 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.126161 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849"} err="failed to get container status \"680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849\": rpc error: code = NotFound desc = could not find container 
\"680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849\": container with ID starting with 680c7069eaceb72cbe5e0b0b1e6f1e1f3379ec59f073eb1caaa372266bffe849 not found: ID does not exist" Apr 28 19:30:36.126230 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.126185 2576 scope.go:117] "RemoveContainer" containerID="f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308" Apr 28 19:30:36.126439 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:36.126423 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308\": container with ID starting with f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308 not found: ID does not exist" containerID="f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308" Apr 28 19:30:36.126481 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.126444 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308"} err="failed to get container status \"f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308\": rpc error: code = NotFound desc = could not find container \"f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308\": container with ID starting with f0aa17ae9a969773992b7f7963033edb6c9c9baf2c257fbb2e3bbc2ec257d308 not found: ID does not exist" Apr 28 19:30:36.131189 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.131164 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"] Apr 28 19:30:36.135128 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.135106 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e1324-predictor-7798b8b7c8-cktzl"] Apr 28 19:30:36.513532 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:36.513498 
2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" path="/var/lib/kubelet/pods/ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf/volumes" Apr 28 19:30:37.078806 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:37.078767 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerName="switch-graph-e1324" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:30:39.833371 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:39.833328 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-e1324-serving-cert: secret "switch-graph-e1324-serving-cert" not found Apr 28 19:30:39.833887 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:39.833418 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls podName:c35727eb-b64c-43fb-814e-4c7bf7a7bd01 nodeName:}" failed. No retries permitted until 2026-04-28 19:30:47.83339959 +0000 UTC m=+863.796299934 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls") pod "switch-graph-e1324-b8c5ffc4-w28k7" (UID: "c35727eb-b64c-43fb-814e-4c7bf7a7bd01") : secret "switch-graph-e1324-serving-cert" not found Apr 28 19:30:40.110253 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:40.110166 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" Apr 28 19:30:40.110777 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:40.110745 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 28 19:30:42.081075 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:42.081038 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerName="switch-graph-e1324" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:30:47.079072 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:47.079031 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerName="switch-graph-e1324" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:30:47.079437 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:47.079137 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" Apr 28 19:30:47.906484 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:47.906444 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-e1324-serving-cert: secret 
"switch-graph-e1324-serving-cert" not found Apr 28 19:30:47.906711 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:30:47.906533 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls podName:c35727eb-b64c-43fb-814e-4c7bf7a7bd01 nodeName:}" failed. No retries permitted until 2026-04-28 19:31:03.906515395 +0000 UTC m=+879.869415740 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls") pod "switch-graph-e1324-b8c5ffc4-w28k7" (UID: "c35727eb-b64c-43fb-814e-4c7bf7a7bd01") : secret "switch-graph-e1324-serving-cert" not found Apr 28 19:30:50.111230 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:50.111192 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 28 19:30:52.078990 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:52.078942 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerName="switch-graph-e1324" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:30:57.078524 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:30:57.078486 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerName="switch-graph-e1324" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:00.111586 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:00.111547 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 28 19:31:02.033350 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.033316 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-59fd89b477-szskd"] Apr 28 19:31:02.033735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.033663 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kube-rbac-proxy" Apr 28 19:31:02.033735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.033674 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kube-rbac-proxy" Apr 28 19:31:02.033735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.033690 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kserve-container" Apr 28 19:31:02.033735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.033696 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kserve-container" Apr 28 19:31:02.033861 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.033744 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kserve-container" Apr 28 19:31:02.033861 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.033752 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba0d6624-ce60-4b4c-ab0d-c60cddbbddcf" containerName="kube-rbac-proxy" Apr 28 19:31:02.039899 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.039882 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:02.042436 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.042414 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 28 19:31:02.042672 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.042654 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 28 19:31:02.046202 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.046178 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-59fd89b477-szskd"] Apr 28 19:31:02.079124 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.079097 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerName="switch-graph-e1324" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:02.130633 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.130583 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d500d4c-1229-4256-848b-499bd86547a4-proxy-tls\") pod \"model-chainer-59fd89b477-szskd\" (UID: \"5d500d4c-1229-4256-848b-499bd86547a4\") " pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:02.130743 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.130701 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d500d4c-1229-4256-848b-499bd86547a4-openshift-service-ca-bundle\") pod \"model-chainer-59fd89b477-szskd\" (UID: \"5d500d4c-1229-4256-848b-499bd86547a4\") " pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:02.231817 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.231790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d500d4c-1229-4256-848b-499bd86547a4-proxy-tls\") pod \"model-chainer-59fd89b477-szskd\" (UID: \"5d500d4c-1229-4256-848b-499bd86547a4\") " pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:02.231943 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.231848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d500d4c-1229-4256-848b-499bd86547a4-openshift-service-ca-bundle\") pod \"model-chainer-59fd89b477-szskd\" (UID: \"5d500d4c-1229-4256-848b-499bd86547a4\") " pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:02.232393 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.232374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d500d4c-1229-4256-848b-499bd86547a4-openshift-service-ca-bundle\") pod \"model-chainer-59fd89b477-szskd\" (UID: \"5d500d4c-1229-4256-848b-499bd86547a4\") " pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:02.234277 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.234251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d500d4c-1229-4256-848b-499bd86547a4-proxy-tls\") pod \"model-chainer-59fd89b477-szskd\" (UID: \"5d500d4c-1229-4256-848b-499bd86547a4\") " pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:02.350920 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.350896 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:02.428358 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.428336 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" Apr 28 19:31:02.479164 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.479139 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-59fd89b477-szskd"] Apr 28 19:31:02.480935 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:31:02.480910 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d500d4c_1229_4256_848b_499bd86547a4.slice/crio-91bfd093ae6b7a074a19efcc354caa1d37ab98b747eec23d8c7b527f150b2694 WatchSource:0}: Error finding container 91bfd093ae6b7a074a19efcc354caa1d37ab98b747eec23d8c7b527f150b2694: Status 404 returned error can't find the container with id 91bfd093ae6b7a074a19efcc354caa1d37ab98b747eec23d8c7b527f150b2694 Apr 28 19:31:02.533577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.533560 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls\") pod \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\" (UID: \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\") " Apr 28 19:31:02.533719 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.533704 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-openshift-service-ca-bundle\") pod \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\" (UID: \"c35727eb-b64c-43fb-814e-4c7bf7a7bd01\") " Apr 28 19:31:02.534007 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.533981 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c35727eb-b64c-43fb-814e-4c7bf7a7bd01" (UID: "c35727eb-b64c-43fb-814e-4c7bf7a7bd01"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:31:02.535709 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.535679 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c35727eb-b64c-43fb-814e-4c7bf7a7bd01" (UID: "c35727eb-b64c-43fb-814e-4c7bf7a7bd01"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:31:02.635186 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.635155 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:31:02.635186 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:02.635183 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c35727eb-b64c-43fb-814e-4c7bf7a7bd01-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:31:03.205076 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.205034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" event={"ID":"5d500d4c-1229-4256-848b-499bd86547a4","Type":"ContainerStarted","Data":"0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8"} Apr 28 19:31:03.205076 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.205076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" 
event={"ID":"5d500d4c-1229-4256-848b-499bd86547a4","Type":"ContainerStarted","Data":"91bfd093ae6b7a074a19efcc354caa1d37ab98b747eec23d8c7b527f150b2694"} Apr 28 19:31:03.205539 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.205264 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:03.206227 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.206206 2576 generic.go:358] "Generic (PLEG): container finished" podID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerID="1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac" exitCode=0 Apr 28 19:31:03.206298 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.206253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" event={"ID":"c35727eb-b64c-43fb-814e-4c7bf7a7bd01","Type":"ContainerDied","Data":"1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac"} Apr 28 19:31:03.206298 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.206269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" event={"ID":"c35727eb-b64c-43fb-814e-4c7bf7a7bd01","Type":"ContainerDied","Data":"9c6ea0c1fd284396a3e10af9ff2bcdb6e0e1243c1d200a5dec597cb9161b2d49"} Apr 28 19:31:03.206298 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.206283 2576 scope.go:117] "RemoveContainer" containerID="1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac" Apr 28 19:31:03.206447 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.206296 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7" Apr 28 19:31:03.214801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.214780 2576 scope.go:117] "RemoveContainer" containerID="1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac" Apr 28 19:31:03.215036 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:31:03.215017 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac\": container with ID starting with 1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac not found: ID does not exist" containerID="1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac" Apr 28 19:31:03.215089 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.215043 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac"} err="failed to get container status \"1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac\": rpc error: code = NotFound desc = could not find container \"1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac\": container with ID starting with 1af7a44b35c094db4d490c134f2f5870005603e8576bad27c6f33769894a09ac not found: ID does not exist" Apr 28 19:31:03.221824 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.221785 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" podStartSLOduration=1.221773244 podStartE2EDuration="1.221773244s" podCreationTimestamp="2026-04-28 19:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:31:03.220731821 +0000 UTC m=+879.183632189" watchObservedRunningTime="2026-04-28 19:31:03.221773244 +0000 UTC m=+879.184673611" Apr 28 19:31:03.233928 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.233906 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"] Apr 28 19:31:03.239940 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:03.239918 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e1324-b8c5ffc4-w28k7"] Apr 28 19:31:04.514088 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:04.514057 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" path="/var/lib/kubelet/pods/c35727eb-b64c-43fb-814e-4c7bf7a7bd01/volumes" Apr 28 19:31:09.215721 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:09.215684 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:10.111117 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:10.111083 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 28 19:31:12.109653 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.109594 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-59fd89b477-szskd"] Apr 28 19:31:12.110063 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.109870 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" podUID="5d500d4c-1229-4256-848b-499bd86547a4" containerName="model-chainer" containerID="cri-o://0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8" gracePeriod=30 Apr 28 19:31:12.272295 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.272264 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"] Apr 28 19:31:12.272589 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.272568 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" containerID="cri-o://520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac" gracePeriod=30 Apr 28 19:31:12.272664 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.272618 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kube-rbac-proxy" containerID="cri-o://a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b" gracePeriod=30 Apr 28 19:31:12.290838 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.290809 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c"] Apr 28 19:31:12.291192 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.291180 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerName="switch-graph-e1324" Apr 28 19:31:12.291236 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.291196 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerName="switch-graph-e1324" Apr 28 19:31:12.291269 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.291250 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c35727eb-b64c-43fb-814e-4c7bf7a7bd01" containerName="switch-graph-e1324" Apr 28 19:31:12.295838 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.295820 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.298237 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.298219 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6c7b3-predictor-serving-cert\"" Apr 28 19:31:12.298315 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.298279 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\"" Apr 28 19:31:12.304593 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.304573 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c"] Apr 28 19:31:12.415487 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.415410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2d82b36-d9ec-4822-aae8-5b166edea3ef-success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.415487 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.415474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2d82b36-d9ec-4822-aae8-5b166edea3ef-proxy-tls\") pod \"success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.415680 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.415514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cmpmq\" (UniqueName: \"kubernetes.io/projected/a2d82b36-d9ec-4822-aae8-5b166edea3ef-kube-api-access-cmpmq\") pod \"success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.516507 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.516479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2d82b36-d9ec-4822-aae8-5b166edea3ef-success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.516681 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.516529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2d82b36-d9ec-4822-aae8-5b166edea3ef-proxy-tls\") pod \"success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.516681 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.516570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmpmq\" (UniqueName: \"kubernetes.io/projected/a2d82b36-d9ec-4822-aae8-5b166edea3ef-kube-api-access-cmpmq\") pod \"success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.517248 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.517219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/a2d82b36-d9ec-4822-aae8-5b166edea3ef-success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.519104 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.519077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2d82b36-d9ec-4822-aae8-5b166edea3ef-proxy-tls\") pod \"success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.525022 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.524996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmpmq\" (UniqueName: \"kubernetes.io/projected/a2d82b36-d9ec-4822-aae8-5b166edea3ef-kube-api-access-cmpmq\") pod \"success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.607661 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.607633 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:12.740532 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:12.740505 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c"] Apr 28 19:31:12.742508 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:31:12.742483 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d82b36_d9ec_4822_aae8_5b166edea3ef.slice/crio-37b55213c27799557532b89bdd2924c32334aa11744053b1a55d596c5448717e WatchSource:0}: Error finding container 37b55213c27799557532b89bdd2924c32334aa11744053b1a55d596c5448717e: Status 404 returned error can't find the container with id 37b55213c27799557532b89bdd2924c32334aa11744053b1a55d596c5448717e Apr 28 19:31:13.240734 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:13.240698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" event={"ID":"a2d82b36-d9ec-4822-aae8-5b166edea3ef","Type":"ContainerStarted","Data":"f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1"} Apr 28 19:31:13.241188 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:13.240741 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" event={"ID":"a2d82b36-d9ec-4822-aae8-5b166edea3ef","Type":"ContainerStarted","Data":"1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f"} Apr 28 19:31:13.241188 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:13.240753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" event={"ID":"a2d82b36-d9ec-4822-aae8-5b166edea3ef","Type":"ContainerStarted","Data":"37b55213c27799557532b89bdd2924c32334aa11744053b1a55d596c5448717e"} Apr 28 19:31:13.241188 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:13.240811 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:13.242623 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:13.242569 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee494290-a65e-414a-9378-97f2c8034eaa" containerID="a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b" exitCode=2 Apr 28 19:31:13.242721 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:13.242649 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" event={"ID":"ee494290-a65e-414a-9378-97f2c8034eaa","Type":"ContainerDied","Data":"a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b"} Apr 28 19:31:13.258924 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:13.258879 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podStartSLOduration=1.25886699 podStartE2EDuration="1.25886699s" podCreationTimestamp="2026-04-28 19:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:31:13.258338987 +0000 UTC m=+889.221239352" watchObservedRunningTime="2026-04-28 19:31:13.25886699 +0000 UTC m=+889.221767355" Apr 28 19:31:13.878979 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:13.878934 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused" Apr 28 19:31:13.883266 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:13.883243 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:31:14.215300 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:14.215201 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" podUID="5d500d4c-1229-4256-848b-499bd86547a4" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:14.247247 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:14.247208 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:14.248994 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:14.248965 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 28 19:31:15.251345 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:15.251304 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 28 19:31:16.618099 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.618077 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" Apr 28 19:31:16.753467 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.753429 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee494290-a65e-414a-9378-97f2c8034eaa-kserve-provision-location\") pod \"ee494290-a65e-414a-9378-97f2c8034eaa\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " Apr 28 19:31:16.753679 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.753495 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ee494290-a65e-414a-9378-97f2c8034eaa-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"ee494290-a65e-414a-9378-97f2c8034eaa\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " Apr 28 19:31:16.753679 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.753636 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6ncm\" (UniqueName: \"kubernetes.io/projected/ee494290-a65e-414a-9378-97f2c8034eaa-kube-api-access-s6ncm\") pod \"ee494290-a65e-414a-9378-97f2c8034eaa\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " Apr 28 19:31:16.753679 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.753667 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee494290-a65e-414a-9378-97f2c8034eaa-proxy-tls\") pod \"ee494290-a65e-414a-9378-97f2c8034eaa\" (UID: \"ee494290-a65e-414a-9378-97f2c8034eaa\") " Apr 28 19:31:16.753857 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.753822 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee494290-a65e-414a-9378-97f2c8034eaa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"ee494290-a65e-414a-9378-97f2c8034eaa" (UID: "ee494290-a65e-414a-9378-97f2c8034eaa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:31:16.753903 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.753881 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee494290-a65e-414a-9378-97f2c8034eaa-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "ee494290-a65e-414a-9378-97f2c8034eaa" (UID: "ee494290-a65e-414a-9378-97f2c8034eaa"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:31:16.753944 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.753900 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee494290-a65e-414a-9378-97f2c8034eaa-kserve-provision-location\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:31:16.755928 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.755897 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee494290-a65e-414a-9378-97f2c8034eaa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ee494290-a65e-414a-9378-97f2c8034eaa" (UID: "ee494290-a65e-414a-9378-97f2c8034eaa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:31:16.755928 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.755908 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee494290-a65e-414a-9378-97f2c8034eaa-kube-api-access-s6ncm" (OuterVolumeSpecName: "kube-api-access-s6ncm") pod "ee494290-a65e-414a-9378-97f2c8034eaa" (UID: "ee494290-a65e-414a-9378-97f2c8034eaa"). InnerVolumeSpecName "kube-api-access-s6ncm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:31:16.854838 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.854801 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6ncm\" (UniqueName: \"kubernetes.io/projected/ee494290-a65e-414a-9378-97f2c8034eaa-kube-api-access-s6ncm\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:31:16.854838 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.854832 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee494290-a65e-414a-9378-97f2c8034eaa-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:31:16.854838 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:16.854845 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ee494290-a65e-414a-9378-97f2c8034eaa-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:31:17.259889 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.259853 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee494290-a65e-414a-9378-97f2c8034eaa" containerID="520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac" exitCode=0 Apr 28 19:31:17.260053 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.259925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" event={"ID":"ee494290-a65e-414a-9378-97f2c8034eaa","Type":"ContainerDied","Data":"520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac"} Apr 28 19:31:17.260053 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.259975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" 
event={"ID":"ee494290-a65e-414a-9378-97f2c8034eaa","Type":"ContainerDied","Data":"fcfbe7a93c39e2b4f0179bbd63bbc38c313708b4c0f94818a59ff34fc3d4ccc4"} Apr 28 19:31:17.260053 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.259996 2576 scope.go:117] "RemoveContainer" containerID="a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b" Apr 28 19:31:17.260195 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.259997 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv" Apr 28 19:31:17.268576 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.268559 2576 scope.go:117] "RemoveContainer" containerID="520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac" Apr 28 19:31:17.275950 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.275931 2576 scope.go:117] "RemoveContainer" containerID="ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a" Apr 28 19:31:17.283715 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.283483 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"] Apr 28 19:31:17.284214 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.284196 2576 scope.go:117] "RemoveContainer" containerID="a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b" Apr 28 19:31:17.284486 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:31:17.284466 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b\": container with ID starting with a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b not found: ID does not exist" containerID="a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b" Apr 28 19:31:17.284558 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.284497 2576 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b"} err="failed to get container status \"a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b\": rpc error: code = NotFound desc = could not find container \"a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b\": container with ID starting with a6654bbe42f60ef284419f79bb43bbd4dc47d68335cbabbad4d93824eea00c7b not found: ID does not exist" Apr 28 19:31:17.284558 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.284522 2576 scope.go:117] "RemoveContainer" containerID="520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac" Apr 28 19:31:17.284826 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:31:17.284801 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac\": container with ID starting with 520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac not found: ID does not exist" containerID="520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac" Apr 28 19:31:17.284930 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.284835 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac"} err="failed to get container status \"520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac\": rpc error: code = NotFound desc = could not find container \"520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac\": container with ID starting with 520870a0f788fe3e334f84cdd911c86a00ab04d48f5624ae75baad92ce9e0cac not found: ID does not exist" Apr 28 19:31:17.284930 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.284855 2576 scope.go:117] "RemoveContainer" containerID="ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a" Apr 28 
19:31:17.285156 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:31:17.285137 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a\": container with ID starting with ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a not found: ID does not exist" containerID="ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a" Apr 28 19:31:17.285233 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.285163 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a"} err="failed to get container status \"ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a\": rpc error: code = NotFound desc = could not find container \"ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a\": container with ID starting with ac8acbf4e841b4a93d3ec99002e62fd47bd5f58ec0ac452deb39e25414166b4a not found: ID does not exist" Apr 28 19:31:17.286011 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:17.285994 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7b45b6d56d-gdxfv"] Apr 28 19:31:18.514151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:18.514118 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" path="/var/lib/kubelet/pods/ee494290-a65e-414a-9378-97f2c8034eaa/volumes" Apr 28 19:31:19.214201 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:19.214161 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" podUID="5d500d4c-1229-4256-848b-499bd86547a4" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:20.111768 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:20.111737 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" Apr 28 19:31:20.256224 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:20.256193 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:31:20.256834 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:20.256801 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 28 19:31:24.214924 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:24.214875 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" podUID="5d500d4c-1229-4256-848b-499bd86547a4" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:24.215381 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:24.215045 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:24.477815 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:24.477730 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:31:24.479540 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:24.479516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:31:24.483349 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:24.483329 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:31:24.484730 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:24.484709 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:31:29.214689 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:29.214641 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" podUID="5d500d4c-1229-4256-848b-499bd86547a4" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:30.257567 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:30.257528 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 28 19:31:32.520979 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.520945 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j"] Apr 28 19:31:32.521321 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.521278 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="storage-initializer" Apr 28 19:31:32.521321 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.521291 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="storage-initializer" Apr 28 19:31:32.521321 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.521305 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kube-rbac-proxy" Apr 28 
19:31:32.521321 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.521310 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kube-rbac-proxy" Apr 28 19:31:32.521448 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.521323 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" Apr 28 19:31:32.521448 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.521329 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" Apr 28 19:31:32.521448 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.521382 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kube-rbac-proxy" Apr 28 19:31:32.521448 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.521392 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee494290-a65e-414a-9378-97f2c8034eaa" containerName="kserve-container" Apr 28 19:31:32.525775 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.525756 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:31:32.528146 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.528126 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-0600b-kube-rbac-proxy-sar-config\"" Apr 28 19:31:32.528230 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.528203 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-0600b-serving-cert\"" Apr 28 19:31:32.533012 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.532770 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j"] Apr 28 19:31:32.573279 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.573228 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-proxy-tls\") pod \"switch-graph-0600b-587b488647-lbj9j\" (UID: \"5f9e1cee-58d5-48e7-a221-5d22c91e1a32\") " pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:31:32.573471 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.573318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-openshift-service-ca-bundle\") pod \"switch-graph-0600b-587b488647-lbj9j\" (UID: \"5f9e1cee-58d5-48e7-a221-5d22c91e1a32\") " pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:31:32.674436 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.674401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-openshift-service-ca-bundle\") pod 
\"switch-graph-0600b-587b488647-lbj9j\" (UID: \"5f9e1cee-58d5-48e7-a221-5d22c91e1a32\") " pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:31:32.674654 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.674486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-proxy-tls\") pod \"switch-graph-0600b-587b488647-lbj9j\" (UID: \"5f9e1cee-58d5-48e7-a221-5d22c91e1a32\") " pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:31:32.675097 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.675063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-openshift-service-ca-bundle\") pod \"switch-graph-0600b-587b488647-lbj9j\" (UID: \"5f9e1cee-58d5-48e7-a221-5d22c91e1a32\") " pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:31:32.677281 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.677198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-proxy-tls\") pod \"switch-graph-0600b-587b488647-lbj9j\" (UID: \"5f9e1cee-58d5-48e7-a221-5d22c91e1a32\") " pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:31:32.836508 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.836416 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:31:32.954799 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:32.954776 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j"] Apr 28 19:31:33.315854 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:33.315820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" event={"ID":"5f9e1cee-58d5-48e7-a221-5d22c91e1a32","Type":"ContainerStarted","Data":"df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69"} Apr 28 19:31:33.315854 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:33.315855 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" event={"ID":"5f9e1cee-58d5-48e7-a221-5d22c91e1a32","Type":"ContainerStarted","Data":"fc0b1df72d0c72c3e961fc0d22fe14a1855176a56dafafc32e43882d78b69163"} Apr 28 19:31:33.316055 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:33.315992 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:31:33.332856 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:33.332801 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" podStartSLOduration=1.332782864 podStartE2EDuration="1.332782864s" podCreationTimestamp="2026-04-28 19:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:31:33.332467702 +0000 UTC m=+909.295368073" watchObservedRunningTime="2026-04-28 19:31:33.332782864 +0000 UTC m=+909.295683229" Apr 28 19:31:34.214377 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:34.214342 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" podUID="5d500d4c-1229-4256-848b-499bd86547a4" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:39.214329 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:39.214285 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" podUID="5d500d4c-1229-4256-848b-499bd86547a4" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:39.325672 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:39.325648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:31:40.257154 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:40.257117 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 28 19:31:42.138203 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:31:42.138168 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d500d4c_1229_4256_848b_499bd86547a4.slice/crio-conmon-0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d500d4c_1229_4256_848b_499bd86547a4.slice/crio-0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8.scope\": RecentStats: unable to find data in memory cache]" Apr 28 19:31:42.138578 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:31:42.138212 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d500d4c_1229_4256_848b_499bd86547a4.slice/crio-conmon-0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d500d4c_1229_4256_848b_499bd86547a4.slice/crio-0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8.scope\": RecentStats: unable to find data in memory cache]" Apr 28 19:31:42.138578 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:31:42.138291 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d500d4c_1229_4256_848b_499bd86547a4.slice/crio-0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d500d4c_1229_4256_848b_499bd86547a4.slice/crio-conmon-0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8.scope\": RecentStats: unable to find data in memory cache]" Apr 28 19:31:42.138578 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:31:42.138429 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d500d4c_1229_4256_848b_499bd86547a4.slice/crio-0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8.scope\": RecentStats: unable to find data in memory cache]" Apr 28 19:31:42.261594 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.261573 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:42.344227 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.344125 2576 generic.go:358] "Generic (PLEG): container finished" podID="5d500d4c-1229-4256-848b-499bd86547a4" containerID="0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8" exitCode=0 Apr 28 19:31:42.344227 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.344174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" event={"ID":"5d500d4c-1229-4256-848b-499bd86547a4","Type":"ContainerDied","Data":"0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8"} Apr 28 19:31:42.344227 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.344181 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" Apr 28 19:31:42.344227 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.344195 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-59fd89b477-szskd" event={"ID":"5d500d4c-1229-4256-848b-499bd86547a4","Type":"ContainerDied","Data":"91bfd093ae6b7a074a19efcc354caa1d37ab98b747eec23d8c7b527f150b2694"} Apr 28 19:31:42.344227 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.344210 2576 scope.go:117] "RemoveContainer" containerID="0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8" Apr 28 19:31:42.353392 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.351813 2576 scope.go:117] "RemoveContainer" containerID="0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8" Apr 28 19:31:42.353392 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:31:42.352077 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8\": container with ID starting with 
0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8 not found: ID does not exist" containerID="0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8" Apr 28 19:31:42.353392 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.352101 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8"} err="failed to get container status \"0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8\": rpc error: code = NotFound desc = could not find container \"0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8\": container with ID starting with 0579a97726d6630412b6f516e4e81bd9b513d0e396abb8f48ee095e0a30989d8 not found: ID does not exist" Apr 28 19:31:42.356709 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.356687 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d500d4c-1229-4256-848b-499bd86547a4-openshift-service-ca-bundle\") pod \"5d500d4c-1229-4256-848b-499bd86547a4\" (UID: \"5d500d4c-1229-4256-848b-499bd86547a4\") " Apr 28 19:31:42.356816 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.356745 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d500d4c-1229-4256-848b-499bd86547a4-proxy-tls\") pod \"5d500d4c-1229-4256-848b-499bd86547a4\" (UID: \"5d500d4c-1229-4256-848b-499bd86547a4\") " Apr 28 19:31:42.357048 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.357018 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d500d4c-1229-4256-848b-499bd86547a4-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "5d500d4c-1229-4256-848b-499bd86547a4" (UID: "5d500d4c-1229-4256-848b-499bd86547a4"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:31:42.359059 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.359037 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d500d4c-1229-4256-848b-499bd86547a4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5d500d4c-1229-4256-848b-499bd86547a4" (UID: "5d500d4c-1229-4256-848b-499bd86547a4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:31:42.457663 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.457633 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d500d4c-1229-4256-848b-499bd86547a4-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:31:42.457663 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.457661 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d500d4c-1229-4256-848b-499bd86547a4-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:31:42.661871 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.661806 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-59fd89b477-szskd"] Apr 28 19:31:42.664309 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:42.664286 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-59fd89b477-szskd"] Apr 28 19:31:44.513490 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:44.513456 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d500d4c-1229-4256-848b-499bd86547a4" path="/var/lib/kubelet/pods/5d500d4c-1229-4256-848b-499bd86547a4/volumes" Apr 28 19:31:50.257744 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:31:50.257704 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 28 19:32:00.257791 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:00.257751 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:32:12.297810 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.297776 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp"] Apr 28 19:32:12.298250 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.298164 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d500d4c-1229-4256-848b-499bd86547a4" containerName="model-chainer" Apr 28 19:32:12.298250 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.298177 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d500d4c-1229-4256-848b-499bd86547a4" containerName="model-chainer" Apr 28 19:32:12.298250 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.298248 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d500d4c-1229-4256-848b-499bd86547a4" containerName="model-chainer" Apr 28 19:32:12.301299 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.301283 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:32:12.303820 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.303798 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-6c7b3-serving-cert\"" Apr 28 19:32:12.303953 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.303918 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-6c7b3-kube-rbac-proxy-sar-config\"" Apr 28 19:32:12.309394 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.309363 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp"] Apr 28 19:32:12.396282 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.396246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c849df-0a23-4f7a-a897-51577af0df9d-openshift-service-ca-bundle\") pod \"sequence-graph-6c7b3-86d869f885-wpbqp\" (UID: \"28c849df-0a23-4f7a-a897-51577af0df9d\") " pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:32:12.396446 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.396290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28c849df-0a23-4f7a-a897-51577af0df9d-proxy-tls\") pod \"sequence-graph-6c7b3-86d869f885-wpbqp\" (UID: \"28c849df-0a23-4f7a-a897-51577af0df9d\") " pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:32:12.497015 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.496983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c849df-0a23-4f7a-a897-51577af0df9d-openshift-service-ca-bundle\") pod 
\"sequence-graph-6c7b3-86d869f885-wpbqp\" (UID: \"28c849df-0a23-4f7a-a897-51577af0df9d\") " pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:32:12.497203 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.497025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28c849df-0a23-4f7a-a897-51577af0df9d-proxy-tls\") pod \"sequence-graph-6c7b3-86d869f885-wpbqp\" (UID: \"28c849df-0a23-4f7a-a897-51577af0df9d\") " pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:32:12.497203 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:32:12.497142 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-6c7b3-serving-cert: secret "sequence-graph-6c7b3-serving-cert" not found Apr 28 19:32:12.497284 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:32:12.497229 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28c849df-0a23-4f7a-a897-51577af0df9d-proxy-tls podName:28c849df-0a23-4f7a-a897-51577af0df9d nodeName:}" failed. No retries permitted until 2026-04-28 19:32:12.997208807 +0000 UTC m=+948.960109154 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/28c849df-0a23-4f7a-a897-51577af0df9d-proxy-tls") pod "sequence-graph-6c7b3-86d869f885-wpbqp" (UID: "28c849df-0a23-4f7a-a897-51577af0df9d") : secret "sequence-graph-6c7b3-serving-cert" not found Apr 28 19:32:12.497632 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:12.497592 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c849df-0a23-4f7a-a897-51577af0df9d-openshift-service-ca-bundle\") pod \"sequence-graph-6c7b3-86d869f885-wpbqp\" (UID: \"28c849df-0a23-4f7a-a897-51577af0df9d\") " pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:32:13.001267 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:13.001227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28c849df-0a23-4f7a-a897-51577af0df9d-proxy-tls\") pod \"sequence-graph-6c7b3-86d869f885-wpbqp\" (UID: \"28c849df-0a23-4f7a-a897-51577af0df9d\") " pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:32:13.003892 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:13.003870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28c849df-0a23-4f7a-a897-51577af0df9d-proxy-tls\") pod \"sequence-graph-6c7b3-86d869f885-wpbqp\" (UID: \"28c849df-0a23-4f7a-a897-51577af0df9d\") " pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:32:13.211854 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:13.211821 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:32:13.333093 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:13.333070 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp"] Apr 28 19:32:13.335331 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:32:13.335301 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28c849df_0a23_4f7a_a897_51577af0df9d.slice/crio-9a0c808be01c2357e8b566245579ec3390a4e2cdcf10de17f6a36929f63e2cc5 WatchSource:0}: Error finding container 9a0c808be01c2357e8b566245579ec3390a4e2cdcf10de17f6a36929f63e2cc5: Status 404 returned error can't find the container with id 9a0c808be01c2357e8b566245579ec3390a4e2cdcf10de17f6a36929f63e2cc5 Apr 28 19:32:13.443224 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:13.443187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" event={"ID":"28c849df-0a23-4f7a-a897-51577af0df9d","Type":"ContainerStarted","Data":"028d64c1a3d399d3711b9aa18e4626f02796f4a315f19cad0d03bde9d3aa61c1"} Apr 28 19:32:13.443224 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:13.443228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" event={"ID":"28c849df-0a23-4f7a-a897-51577af0df9d","Type":"ContainerStarted","Data":"9a0c808be01c2357e8b566245579ec3390a4e2cdcf10de17f6a36929f63e2cc5"} Apr 28 19:32:13.443460 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:13.443251 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:32:13.460029 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:13.459982 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" 
podStartSLOduration=1.459967481 podStartE2EDuration="1.459967481s" podCreationTimestamp="2026-04-28 19:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:32:13.458960487 +0000 UTC m=+949.421860853" watchObservedRunningTime="2026-04-28 19:32:13.459967481 +0000 UTC m=+949.422867846" Apr 28 19:32:19.452283 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:32:19.452252 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:36:24.499589 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:36:24.499507 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:36:24.502139 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:36:24.502115 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:36:24.505116 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:36:24.505097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:36:24.507515 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:36:24.507499 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:39:47.205012 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.204969 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j"] Apr 28 19:39:47.205573 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.205222 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerName="switch-graph-0600b" containerID="cri-o://df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69" gracePeriod=30 Apr 28 19:39:47.333957 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.333922 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"] Apr 28 19:39:47.334249 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.334202 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" containerID="cri-o://6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0" gracePeriod=30 Apr 28 19:39:47.334331 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.334233 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kube-rbac-proxy" containerID="cri-o://77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182" gracePeriod=30 Apr 28 19:39:47.395228 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.395193 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm"] Apr 28 19:39:47.398660 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.398641 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:47.401167 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.401147 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-2a210-kube-rbac-proxy-sar-config\"" Apr 28 19:39:47.401442 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.401428 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-2a210-predictor-serving-cert\"" Apr 28 19:39:47.416004 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.415978 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm"] Apr 28 19:39:47.462133 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.462041 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4ktb\" (UniqueName: \"kubernetes.io/projected/dfa7538c-fd88-4bac-ad4b-312c32e20b30-kube-api-access-p4ktb\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:47.462133 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.462107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-2a210-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa7538c-fd88-4bac-ad4b-312c32e20b30-success-200-isvc-2a210-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:47.462350 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.462144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:47.563018 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.562972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ktb\" (UniqueName: \"kubernetes.io/projected/dfa7538c-fd88-4bac-ad4b-312c32e20b30-kube-api-access-p4ktb\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:47.563018 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.563033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-2a210-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa7538c-fd88-4bac-ad4b-312c32e20b30-success-200-isvc-2a210-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:47.563260 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.563057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:47.563260 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:39:47.563145 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-2a210-predictor-serving-cert: secret "success-200-isvc-2a210-predictor-serving-cert" not found Apr 
28 19:39:47.563260 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:39:47.563194 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls podName:dfa7538c-fd88-4bac-ad4b-312c32e20b30 nodeName:}" failed. No retries permitted until 2026-04-28 19:39:48.063178533 +0000 UTC m=+1404.026078876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls") pod "success-200-isvc-2a210-predictor-846fc69544-c4mwm" (UID: "dfa7538c-fd88-4bac-ad4b-312c32e20b30") : secret "success-200-isvc-2a210-predictor-serving-cert" not found Apr 28 19:39:47.563734 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.563713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-2a210-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa7538c-fd88-4bac-ad4b-312c32e20b30-success-200-isvc-2a210-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:47.572296 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.572269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4ktb\" (UniqueName: \"kubernetes.io/projected/dfa7538c-fd88-4bac-ad4b-312c32e20b30-kube-api-access-p4ktb\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:47.930852 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:47.930815 2576 generic.go:358] "Generic (PLEG): container finished" podID="26d374c9-c909-4ef7-b723-e234c4404579" containerID="77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182" exitCode=2 Apr 28 19:39:47.931024 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:39:47.930866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" event={"ID":"26d374c9-c909-4ef7-b723-e234c4404579","Type":"ContainerDied","Data":"77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182"} Apr 28 19:39:48.066529 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:48.066487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:48.066755 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:39:48.066643 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-2a210-predictor-serving-cert: secret "success-200-isvc-2a210-predictor-serving-cert" not found Apr 28 19:39:48.066755 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:39:48.066704 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls podName:dfa7538c-fd88-4bac-ad4b-312c32e20b30 nodeName:}" failed. No retries permitted until 2026-04-28 19:39:49.066688079 +0000 UTC m=+1405.029588423 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls") pod "success-200-isvc-2a210-predictor-846fc69544-c4mwm" (UID: "dfa7538c-fd88-4bac-ad4b-312c32e20b30") : secret "success-200-isvc-2a210-predictor-serving-cert" not found Apr 28 19:39:49.075130 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.075085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:49.077722 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.077690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls\") pod \"success-200-isvc-2a210-predictor-846fc69544-c4mwm\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:49.208982 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.208938 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:49.323340 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.323299 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerName="switch-graph-0600b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:39:49.334139 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.334115 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm"] Apr 28 19:39:49.336024 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:39:49.335993 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfa7538c_fd88_4bac_ad4b_312c32e20b30.slice/crio-2307eb434fa62574666cd1730e9c020f33d00ab4d893249d299f53f9c94ca843 WatchSource:0}: Error finding container 2307eb434fa62574666cd1730e9c020f33d00ab4d893249d299f53f9c94ca843: Status 404 returned error can't find the container with id 2307eb434fa62574666cd1730e9c020f33d00ab4d893249d299f53f9c94ca843 Apr 28 19:39:49.337860 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.337844 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:39:49.939252 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.939211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" event={"ID":"dfa7538c-fd88-4bac-ad4b-312c32e20b30","Type":"ContainerStarted","Data":"a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39"} Apr 28 19:39:49.939252 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.939253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" 
event={"ID":"dfa7538c-fd88-4bac-ad4b-312c32e20b30","Type":"ContainerStarted","Data":"f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409"} Apr 28 19:39:49.939475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.939267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" event={"ID":"dfa7538c-fd88-4bac-ad4b-312c32e20b30","Type":"ContainerStarted","Data":"2307eb434fa62574666cd1730e9c020f33d00ab4d893249d299f53f9c94ca843"} Apr 28 19:39:49.939475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.939326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:49.957671 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:49.957618 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" podStartSLOduration=2.957585261 podStartE2EDuration="2.957585261s" podCreationTimestamp="2026-04-28 19:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:39:49.955980311 +0000 UTC m=+1405.918880693" watchObservedRunningTime="2026-04-28 19:39:49.957585261 +0000 UTC m=+1405.920485627" Apr 28 19:39:50.106901 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.106856 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 28 19:39:50.111195 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.111173 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 28 19:39:50.567543 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.567517 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" Apr 28 19:39:50.589424 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.589398 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v4wp\" (UniqueName: \"kubernetes.io/projected/26d374c9-c909-4ef7-b723-e234c4404579-kube-api-access-7v4wp\") pod \"26d374c9-c909-4ef7-b723-e234c4404579\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " Apr 28 19:39:50.589566 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.589434 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-0600b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26d374c9-c909-4ef7-b723-e234c4404579-success-200-isvc-0600b-kube-rbac-proxy-sar-config\") pod \"26d374c9-c909-4ef7-b723-e234c4404579\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " Apr 28 19:39:50.589566 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.589491 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d374c9-c909-4ef7-b723-e234c4404579-proxy-tls\") pod \"26d374c9-c909-4ef7-b723-e234c4404579\" (UID: \"26d374c9-c909-4ef7-b723-e234c4404579\") " Apr 28 19:39:50.589875 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.589851 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d374c9-c909-4ef7-b723-e234c4404579-success-200-isvc-0600b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"success-200-isvc-0600b-kube-rbac-proxy-sar-config") pod "26d374c9-c909-4ef7-b723-e234c4404579" (UID: "26d374c9-c909-4ef7-b723-e234c4404579"). InnerVolumeSpecName "success-200-isvc-0600b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:39:50.591669 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.591636 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d374c9-c909-4ef7-b723-e234c4404579-kube-api-access-7v4wp" (OuterVolumeSpecName: "kube-api-access-7v4wp") pod "26d374c9-c909-4ef7-b723-e234c4404579" (UID: "26d374c9-c909-4ef7-b723-e234c4404579"). InnerVolumeSpecName "kube-api-access-7v4wp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:39:50.591776 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.591756 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d374c9-c909-4ef7-b723-e234c4404579-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "26d374c9-c909-4ef7-b723-e234c4404579" (UID: "26d374c9-c909-4ef7-b723-e234c4404579"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:39:50.690743 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.690671 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7v4wp\" (UniqueName: \"kubernetes.io/projected/26d374c9-c909-4ef7-b723-e234c4404579-kube-api-access-7v4wp\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:39:50.690743 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.690697 2576 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-0600b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26d374c9-c909-4ef7-b723-e234c4404579-success-200-isvc-0600b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:39:50.690743 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.690709 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d374c9-c909-4ef7-b723-e234c4404579-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:39:50.944283 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.944187 2576 generic.go:358] "Generic (PLEG): container finished" podID="26d374c9-c909-4ef7-b723-e234c4404579" containerID="6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0" exitCode=0 Apr 28 19:39:50.944454 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.944328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" event={"ID":"26d374c9-c909-4ef7-b723-e234c4404579","Type":"ContainerDied","Data":"6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0"} Apr 28 19:39:50.944454 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.944342 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" Apr 28 19:39:50.944454 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.944368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n" event={"ID":"26d374c9-c909-4ef7-b723-e234c4404579","Type":"ContainerDied","Data":"7279d902e88a4cdaffb559ab468e03dd5915e81043b9eb35f6ed37931131ce7e"} Apr 28 19:39:50.944454 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.944388 2576 scope.go:117] "RemoveContainer" containerID="77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182" Apr 28 19:39:50.944952 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.944927 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:50.946785 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.946755 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 28 19:39:50.953819 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.953802 2576 scope.go:117] "RemoveContainer" containerID="6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0" Apr 28 19:39:50.962049 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.962025 2576 scope.go:117] "RemoveContainer" containerID="77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182" Apr 28 19:39:50.962340 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:39:50.962321 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182\": container with ID starting with 
77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182 not found: ID does not exist" containerID="77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182" Apr 28 19:39:50.962384 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.962349 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182"} err="failed to get container status \"77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182\": rpc error: code = NotFound desc = could not find container \"77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182\": container with ID starting with 77b1b7a4a4c7bf431c816ff210fd62e3a999f0a5267951c31dfa2591104f8182 not found: ID does not exist" Apr 28 19:39:50.962384 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.962368 2576 scope.go:117] "RemoveContainer" containerID="6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0" Apr 28 19:39:50.962627 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:39:50.962581 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0\": container with ID starting with 6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0 not found: ID does not exist" containerID="6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0" Apr 28 19:39:50.962685 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.962638 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0"} err="failed to get container status \"6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0\": rpc error: code = NotFound desc = could not find container \"6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0\": container with ID starting with 
6f4488adff6cdcaee85693455beca4883abed01041f9bbead35c9a145dd0bec0 not found: ID does not exist" Apr 28 19:39:50.965316 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.965287 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"] Apr 28 19:39:50.970152 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:50.970129 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0600b-predictor-655f9fcfd4-98d2n"] Apr 28 19:39:51.950985 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:51.950945 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 28 19:39:52.514054 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:52.514024 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d374c9-c909-4ef7-b723-e234c4404579" path="/var/lib/kubelet/pods/26d374c9-c909-4ef7-b723-e234c4404579/volumes" Apr 28 19:39:54.323059 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:54.323023 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerName="switch-graph-0600b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:39:56.955969 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:56.955932 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:39:56.956339 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:56.956312 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" 
podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 28 19:39:59.323003 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:59.322967 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerName="switch-graph-0600b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:39:59.323370 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:39:59.323134 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:40:04.323137 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:04.323093 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerName="switch-graph-0600b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:40:06.956517 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:06.956474 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 28 19:40:09.323808 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:09.323768 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerName="switch-graph-0600b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:40:14.322890 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:14.322801 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerName="switch-graph-0600b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:40:16.957266 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:16.957227 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 28 19:40:17.852134 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:17.852105 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:40:17.917781 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:17.917753 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-proxy-tls\") pod \"5f9e1cee-58d5-48e7-a221-5d22c91e1a32\" (UID: \"5f9e1cee-58d5-48e7-a221-5d22c91e1a32\") " Apr 28 19:40:17.917943 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:17.917850 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-openshift-service-ca-bundle\") pod \"5f9e1cee-58d5-48e7-a221-5d22c91e1a32\" (UID: \"5f9e1cee-58d5-48e7-a221-5d22c91e1a32\") " Apr 28 19:40:17.918190 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:17.918168 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "5f9e1cee-58d5-48e7-a221-5d22c91e1a32" (UID: "5f9e1cee-58d5-48e7-a221-5d22c91e1a32"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:40:17.920023 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:17.920000 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5f9e1cee-58d5-48e7-a221-5d22c91e1a32" (UID: "5f9e1cee-58d5-48e7-a221-5d22c91e1a32"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:40:18.019134 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.019101 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:40:18.019134 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.019135 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9e1cee-58d5-48e7-a221-5d22c91e1a32-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:40:18.038196 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.038167 2576 generic.go:358] "Generic (PLEG): container finished" podID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerID="df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69" exitCode=0 Apr 28 19:40:18.038327 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.038233 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" Apr 28 19:40:18.038327 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.038256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" event={"ID":"5f9e1cee-58d5-48e7-a221-5d22c91e1a32","Type":"ContainerDied","Data":"df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69"} Apr 28 19:40:18.038327 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.038293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j" event={"ID":"5f9e1cee-58d5-48e7-a221-5d22c91e1a32","Type":"ContainerDied","Data":"fc0b1df72d0c72c3e961fc0d22fe14a1855176a56dafafc32e43882d78b69163"} Apr 28 19:40:18.038327 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.038324 2576 scope.go:117] "RemoveContainer" containerID="df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69" Apr 28 19:40:18.046528 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.046512 2576 scope.go:117] "RemoveContainer" containerID="df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69" Apr 28 19:40:18.046808 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:40:18.046789 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69\": container with ID starting with df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69 not found: ID does not exist" containerID="df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69" Apr 28 19:40:18.046893 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.046815 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69"} err="failed to get container status 
\"df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69\": rpc error: code = NotFound desc = could not find container \"df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69\": container with ID starting with df1963d7f00de35140262a20875b598e9f77e65a42050fcb9ffffe56d9269f69 not found: ID does not exist" Apr 28 19:40:18.058135 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.058111 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j"] Apr 28 19:40:18.062101 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.062080 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-0600b-587b488647-lbj9j"] Apr 28 19:40:18.514265 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:18.514238 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" path="/var/lib/kubelet/pods/5f9e1cee-58d5-48e7-a221-5d22c91e1a32/volumes" Apr 28 19:40:26.957392 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:26.957348 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 28 19:40:27.118255 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.118221 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp"] Apr 28 19:40:27.118498 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.118473 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" containerName="sequence-graph-6c7b3" containerID="cri-o://028d64c1a3d399d3711b9aa18e4626f02796f4a315f19cad0d03bde9d3aa61c1" gracePeriod=30 Apr 28 
19:40:27.225588 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.225505 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c"] Apr 28 19:40:27.225818 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.225795 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" containerID="cri-o://1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f" gracePeriod=30 Apr 28 19:40:27.225891 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.225824 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kube-rbac-proxy" containerID="cri-o://f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1" gracePeriod=30 Apr 28 19:40:27.274566 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.274537 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6"] Apr 28 19:40:27.275081 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.275063 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerName="switch-graph-0600b" Apr 28 19:40:27.275141 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.275086 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerName="switch-graph-0600b" Apr 28 19:40:27.275141 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.275108 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" Apr 28 19:40:27.275141 ip-10-0-139-184 kubenswrapper[2576]: I0428 
19:40:27.275116 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" Apr 28 19:40:27.275141 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.275133 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kube-rbac-proxy" Apr 28 19:40:27.275141 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.275141 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kube-rbac-proxy" Apr 28 19:40:27.275288 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.275220 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kserve-container" Apr 28 19:40:27.275288 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.275236 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="26d374c9-c909-4ef7-b723-e234c4404579" containerName="kube-rbac-proxy" Apr 28 19:40:27.275288 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.275246 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f9e1cee-58d5-48e7-a221-5d22c91e1a32" containerName="switch-graph-0600b" Apr 28 19:40:27.279928 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.279909 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:27.282422 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.282401 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-26ebe-kube-rbac-proxy-sar-config\"" Apr 28 19:40:27.282722 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.282702 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-26ebe-predictor-serving-cert\"" Apr 28 19:40:27.286659 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.286619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6"] Apr 28 19:40:27.395632 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.395562 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-26ebe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/10453b2a-d390-4d9c-9088-02ea43478764-success-200-isvc-26ebe-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-26ebe-predictor-585cfc74d-f52v6\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:27.395782 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.395701 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gqct\" (UniqueName: \"kubernetes.io/projected/10453b2a-d390-4d9c-9088-02ea43478764-kube-api-access-5gqct\") pod \"success-200-isvc-26ebe-predictor-585cfc74d-f52v6\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:27.395782 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.395767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10453b2a-d390-4d9c-9088-02ea43478764-proxy-tls\") pod \"success-200-isvc-26ebe-predictor-585cfc74d-f52v6\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:27.496960 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.496918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10453b2a-d390-4d9c-9088-02ea43478764-proxy-tls\") pod \"success-200-isvc-26ebe-predictor-585cfc74d-f52v6\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:27.497208 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.496969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-26ebe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/10453b2a-d390-4d9c-9088-02ea43478764-success-200-isvc-26ebe-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-26ebe-predictor-585cfc74d-f52v6\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:27.497208 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.497052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gqct\" (UniqueName: \"kubernetes.io/projected/10453b2a-d390-4d9c-9088-02ea43478764-kube-api-access-5gqct\") pod \"success-200-isvc-26ebe-predictor-585cfc74d-f52v6\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:27.497208 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:40:27.497082 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-serving-cert: secret "success-200-isvc-26ebe-predictor-serving-cert" not found Apr 28 
19:40:27.497208 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:40:27.497166 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10453b2a-d390-4d9c-9088-02ea43478764-proxy-tls podName:10453b2a-d390-4d9c-9088-02ea43478764 nodeName:}" failed. No retries permitted until 2026-04-28 19:40:27.997143025 +0000 UTC m=+1443.960043383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/10453b2a-d390-4d9c-9088-02ea43478764-proxy-tls") pod "success-200-isvc-26ebe-predictor-585cfc74d-f52v6" (UID: "10453b2a-d390-4d9c-9088-02ea43478764") : secret "success-200-isvc-26ebe-predictor-serving-cert" not found Apr 28 19:40:27.497744 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.497718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-26ebe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/10453b2a-d390-4d9c-9088-02ea43478764-success-200-isvc-26ebe-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-26ebe-predictor-585cfc74d-f52v6\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:27.508693 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:27.508666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gqct\" (UniqueName: \"kubernetes.io/projected/10453b2a-d390-4d9c-9088-02ea43478764-kube-api-access-5gqct\") pod \"success-200-isvc-26ebe-predictor-585cfc74d-f52v6\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:28.000957 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:28.000922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10453b2a-d390-4d9c-9088-02ea43478764-proxy-tls\") pod \"success-200-isvc-26ebe-predictor-585cfc74d-f52v6\" 
(UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:28.003465 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:28.003435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10453b2a-d390-4d9c-9088-02ea43478764-proxy-tls\") pod \"success-200-isvc-26ebe-predictor-585cfc74d-f52v6\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:28.075816 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:28.075764 2576 generic.go:358] "Generic (PLEG): container finished" podID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerID="f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1" exitCode=2 Apr 28 19:40:28.075975 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:28.075841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" event={"ID":"a2d82b36-d9ec-4822-aae8-5b166edea3ef","Type":"ContainerDied","Data":"f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1"} Apr 28 19:40:28.193323 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:28.193285 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:28.316369 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:28.316338 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6"] Apr 28 19:40:28.319520 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:40:28.319473 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10453b2a_d390_4d9c_9088_02ea43478764.slice/crio-9e4e9fefb12308bdfcaba4d432d3a498670cfc3099548c56aa3b52ff630498c0 WatchSource:0}: Error finding container 9e4e9fefb12308bdfcaba4d432d3a498670cfc3099548c56aa3b52ff630498c0: Status 404 returned error can't find the container with id 9e4e9fefb12308bdfcaba4d432d3a498670cfc3099548c56aa3b52ff630498c0 Apr 28 19:40:29.080594 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:29.080553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" event={"ID":"10453b2a-d390-4d9c-9088-02ea43478764","Type":"ContainerStarted","Data":"10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7"} Apr 28 19:40:29.080594 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:29.080589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" event={"ID":"10453b2a-d390-4d9c-9088-02ea43478764","Type":"ContainerStarted","Data":"ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5"} Apr 28 19:40:29.080594 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:29.080618 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" event={"ID":"10453b2a-d390-4d9c-9088-02ea43478764","Type":"ContainerStarted","Data":"9e4e9fefb12308bdfcaba4d432d3a498670cfc3099548c56aa3b52ff630498c0"} Apr 28 19:40:29.081152 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:40:29.080725 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:29.081152 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:29.080749 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:29.082183 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:29.082158 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:40:29.099526 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:29.099464 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" podStartSLOduration=2.099447672 podStartE2EDuration="2.099447672s" podCreationTimestamp="2026-04-28 19:40:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:40:29.098413321 +0000 UTC m=+1445.061313700" watchObservedRunningTime="2026-04-28 19:40:29.099447672 +0000 UTC m=+1445.062348037" Apr 28 19:40:29.452990 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:29.452908 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" containerName="sequence-graph-6c7b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:40:30.084364 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.084317 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" 
podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:40:30.252380 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.252337 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused" Apr 28 19:40:30.257836 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.257802 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 28 19:40:30.374917 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.374896 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:40:30.521231 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.521200 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2d82b36-d9ec-4822-aae8-5b166edea3ef-success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\") pod \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " Apr 28 19:40:30.521397 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.521256 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2d82b36-d9ec-4822-aae8-5b166edea3ef-proxy-tls\") pod \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " Apr 28 19:40:30.521397 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.521277 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmpmq\" (UniqueName: \"kubernetes.io/projected/a2d82b36-d9ec-4822-aae8-5b166edea3ef-kube-api-access-cmpmq\") pod \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\" (UID: \"a2d82b36-d9ec-4822-aae8-5b166edea3ef\") " Apr 28 19:40:30.521554 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.521527 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d82b36-d9ec-4822-aae8-5b166edea3ef-success-200-isvc-6c7b3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-6c7b3-kube-rbac-proxy-sar-config") pod "a2d82b36-d9ec-4822-aae8-5b166edea3ef" (UID: "a2d82b36-d9ec-4822-aae8-5b166edea3ef"). InnerVolumeSpecName "success-200-isvc-6c7b3-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:40:30.523584 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.523557 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d82b36-d9ec-4822-aae8-5b166edea3ef-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a2d82b36-d9ec-4822-aae8-5b166edea3ef" (UID: "a2d82b36-d9ec-4822-aae8-5b166edea3ef"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:40:30.523584 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.523572 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d82b36-d9ec-4822-aae8-5b166edea3ef-kube-api-access-cmpmq" (OuterVolumeSpecName: "kube-api-access-cmpmq") pod "a2d82b36-d9ec-4822-aae8-5b166edea3ef" (UID: "a2d82b36-d9ec-4822-aae8-5b166edea3ef"). InnerVolumeSpecName "kube-api-access-cmpmq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:40:30.622391 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.622302 2576 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2d82b36-d9ec-4822-aae8-5b166edea3ef-success-200-isvc-6c7b3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:40:30.622391 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.622341 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2d82b36-d9ec-4822-aae8-5b166edea3ef-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:40:30.622391 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:30.622351 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cmpmq\" (UniqueName: \"kubernetes.io/projected/a2d82b36-d9ec-4822-aae8-5b166edea3ef-kube-api-access-cmpmq\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 
19:40:31.088434 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.088403 2576 generic.go:358] "Generic (PLEG): container finished" podID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerID="1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f" exitCode=0 Apr 28 19:40:31.088843 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.088479 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" Apr 28 19:40:31.088843 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.088481 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" event={"ID":"a2d82b36-d9ec-4822-aae8-5b166edea3ef","Type":"ContainerDied","Data":"1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f"} Apr 28 19:40:31.088843 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.088520 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c" event={"ID":"a2d82b36-d9ec-4822-aae8-5b166edea3ef","Type":"ContainerDied","Data":"37b55213c27799557532b89bdd2924c32334aa11744053b1a55d596c5448717e"} Apr 28 19:40:31.088843 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.088536 2576 scope.go:117] "RemoveContainer" containerID="f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1" Apr 28 19:40:31.097398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.097381 2576 scope.go:117] "RemoveContainer" containerID="1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f" Apr 28 19:40:31.104205 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.104191 2576 scope.go:117] "RemoveContainer" containerID="f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1" Apr 28 19:40:31.104446 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:40:31.104427 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1\": container with ID starting with f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1 not found: ID does not exist" containerID="f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1" Apr 28 19:40:31.104507 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.104455 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1"} err="failed to get container status \"f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1\": rpc error: code = NotFound desc = could not find container \"f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1\": container with ID starting with f34f177de093a20509c3771dcf7b31764e9a21bdf060f8c79c2162fbfcd423a1 not found: ID does not exist" Apr 28 19:40:31.104507 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.104471 2576 scope.go:117] "RemoveContainer" containerID="1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f" Apr 28 19:40:31.104728 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:40:31.104711 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f\": container with ID starting with 1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f not found: ID does not exist" containerID="1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f" Apr 28 19:40:31.104782 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.104734 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f"} err="failed to get container status \"1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f\": rpc error: code = NotFound desc = could 
not find container \"1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f\": container with ID starting with 1678f39fbf4a5475c83e6488ba51cf1fd7edaa893d655c5e757a20a49d9f0b0f not found: ID does not exist" Apr 28 19:40:31.110408 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.110388 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c"] Apr 28 19:40:31.115645 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:31.115627 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6c7b3-predictor-7f4499fcf8-f5g4c"] Apr 28 19:40:32.514972 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:32.514937 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" path="/var/lib/kubelet/pods/a2d82b36-d9ec-4822-aae8-5b166edea3ef/volumes" Apr 28 19:40:34.452074 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:34.452021 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" containerName="sequence-graph-6c7b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:40:35.089017 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:35.088988 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:40:35.089452 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:35.089423 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:40:36.957386 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:36.957358 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:40:39.452450 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:39.452413 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" containerName="sequence-graph-6c7b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:40:39.452826 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:39.452541 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:40:44.452026 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:44.451985 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" containerName="sequence-graph-6c7b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:40:45.090348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:45.090306 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:40:47.442370 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.442335 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j"] Apr 28 19:40:47.442804 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.442713 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kube-rbac-proxy" Apr 28 19:40:47.442804 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.442727 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kube-rbac-proxy" Apr 28 19:40:47.442804 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.442744 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" Apr 28 19:40:47.442804 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.442750 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" Apr 28 19:40:47.442804 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.442801 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kserve-container" Apr 28 19:40:47.443067 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.442812 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2d82b36-d9ec-4822-aae8-5b166edea3ef" containerName="kube-rbac-proxy" Apr 28 19:40:47.447000 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.446976 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:47.449278 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.449251 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-2a210-serving-cert\"" Apr 28 19:40:47.449379 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.449359 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-2a210-kube-rbac-proxy-sar-config\"" Apr 28 19:40:47.457577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.457552 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j"] Apr 28 19:40:47.469583 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.469557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-proxy-tls\") pod \"ensemble-graph-2a210-76dff99855-v7s6j\" (UID: \"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\") " pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:47.469713 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.469616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-openshift-service-ca-bundle\") pod \"ensemble-graph-2a210-76dff99855-v7s6j\" (UID: \"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\") " pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:47.571010 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.570964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-proxy-tls\") pod \"ensemble-graph-2a210-76dff99855-v7s6j\" (UID: 
\"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\") " pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:47.571010 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.571013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-openshift-service-ca-bundle\") pod \"ensemble-graph-2a210-76dff99855-v7s6j\" (UID: \"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\") " pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:47.571271 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:40:47.571171 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-2a210-serving-cert: secret "ensemble-graph-2a210-serving-cert" not found Apr 28 19:40:47.571271 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:40:47.571250 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-proxy-tls podName:6f6b7d47-ecfc-45df-a4ea-c6a83efedf61 nodeName:}" failed. No retries permitted until 2026-04-28 19:40:48.07122789 +0000 UTC m=+1464.034128238 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-proxy-tls") pod "ensemble-graph-2a210-76dff99855-v7s6j" (UID: "6f6b7d47-ecfc-45df-a4ea-c6a83efedf61") : secret "ensemble-graph-2a210-serving-cert" not found Apr 28 19:40:47.571725 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:47.571705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-openshift-service-ca-bundle\") pod \"ensemble-graph-2a210-76dff99855-v7s6j\" (UID: \"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\") " pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:48.074303 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:48.074261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-proxy-tls\") pod \"ensemble-graph-2a210-76dff99855-v7s6j\" (UID: \"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\") " pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:48.076908 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:48.076884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-proxy-tls\") pod \"ensemble-graph-2a210-76dff99855-v7s6j\" (UID: \"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\") " pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:48.359576 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:48.359480 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:48.483565 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:48.483540 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j"] Apr 28 19:40:48.486014 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:40:48.485989 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f6b7d47_ecfc_45df_a4ea_c6a83efedf61.slice/crio-cb72b5d440bbc65b875d8730147ab2670720d7188cab1bafe981f30051bc46da WatchSource:0}: Error finding container cb72b5d440bbc65b875d8730147ab2670720d7188cab1bafe981f30051bc46da: Status 404 returned error can't find the container with id cb72b5d440bbc65b875d8730147ab2670720d7188cab1bafe981f30051bc46da Apr 28 19:40:49.150460 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:49.150421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" event={"ID":"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61","Type":"ContainerStarted","Data":"8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430"} Apr 28 19:40:49.150460 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:49.150457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" event={"ID":"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61","Type":"ContainerStarted","Data":"cb72b5d440bbc65b875d8730147ab2670720d7188cab1bafe981f30051bc46da"} Apr 28 19:40:49.150758 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:49.150631 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:49.169760 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:49.169714 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" 
podStartSLOduration=2.169700194 podStartE2EDuration="2.169700194s" podCreationTimestamp="2026-04-28 19:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:40:49.167973254 +0000 UTC m=+1465.130873652" watchObservedRunningTime="2026-04-28 19:40:49.169700194 +0000 UTC m=+1465.132600706" Apr 28 19:40:49.450652 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:49.450531 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" containerName="sequence-graph-6c7b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:40:54.451485 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:54.451444 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" containerName="sequence-graph-6c7b3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:40:55.089474 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:55.089436 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:40:55.159015 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:55.158988 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:40:57.175698 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.175666 2576 generic.go:358] "Generic (PLEG): container finished" podID="28c849df-0a23-4f7a-a897-51577af0df9d" containerID="028d64c1a3d399d3711b9aa18e4626f02796f4a315f19cad0d03bde9d3aa61c1" exitCode=0 Apr 28 19:40:57.176069 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.175729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" event={"ID":"28c849df-0a23-4f7a-a897-51577af0df9d","Type":"ContainerDied","Data":"028d64c1a3d399d3711b9aa18e4626f02796f4a315f19cad0d03bde9d3aa61c1"} Apr 28 19:40:57.261266 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.261241 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:40:57.351015 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.350974 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c849df-0a23-4f7a-a897-51577af0df9d-openshift-service-ca-bundle\") pod \"28c849df-0a23-4f7a-a897-51577af0df9d\" (UID: \"28c849df-0a23-4f7a-a897-51577af0df9d\") " Apr 28 19:40:57.351173 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.351061 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28c849df-0a23-4f7a-a897-51577af0df9d-proxy-tls\") pod \"28c849df-0a23-4f7a-a897-51577af0df9d\" (UID: \"28c849df-0a23-4f7a-a897-51577af0df9d\") " Apr 28 19:40:57.351377 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.351355 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c849df-0a23-4f7a-a897-51577af0df9d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "28c849df-0a23-4f7a-a897-51577af0df9d" (UID: "28c849df-0a23-4f7a-a897-51577af0df9d"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:40:57.353277 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.353260 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c849df-0a23-4f7a-a897-51577af0df9d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "28c849df-0a23-4f7a-a897-51577af0df9d" (UID: "28c849df-0a23-4f7a-a897-51577af0df9d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:40:57.452145 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.452070 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28c849df-0a23-4f7a-a897-51577af0df9d-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:40:57.452145 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.452101 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c849df-0a23-4f7a-a897-51577af0df9d-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:40:57.512844 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.512813 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j"] Apr 28 19:40:57.513035 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.513019 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerName="ensemble-graph-2a210" containerID="cri-o://8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430" gracePeriod=30 Apr 28 19:40:57.621334 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.621297 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm"] Apr 28 19:40:57.621668 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:40:57.621632 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kserve-container" containerID="cri-o://f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409" gracePeriod=30 Apr 28 19:40:57.621668 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.621663 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kube-rbac-proxy" containerID="cri-o://a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39" gracePeriod=30 Apr 28 19:40:57.657754 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.657720 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq"] Apr 28 19:40:57.658093 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.658080 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" containerName="sequence-graph-6c7b3" Apr 28 19:40:57.658139 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.658094 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" containerName="sequence-graph-6c7b3" Apr 28 19:40:57.658180 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.658154 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" containerName="sequence-graph-6c7b3" Apr 28 19:40:57.661415 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.661395 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:57.664166 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.664137 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-0ca59-predictor-serving-cert\"" Apr 28 19:40:57.664302 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.664140 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-0ca59-kube-rbac-proxy-sar-config\"" Apr 28 19:40:57.674633 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.674567 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq"] Apr 28 19:40:57.754619 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.754560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f191223-363e-4407-ad08-1d395af94d5f-proxy-tls\") pod \"success-200-isvc-0ca59-predictor-b4665bc9b-65wkq\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:57.754773 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.754627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-0ca59-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f191223-363e-4407-ad08-1d395af94d5f-success-200-isvc-0ca59-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0ca59-predictor-b4665bc9b-65wkq\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:57.754773 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.754698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-94qpr\" (UniqueName: \"kubernetes.io/projected/4f191223-363e-4407-ad08-1d395af94d5f-kube-api-access-94qpr\") pod \"success-200-isvc-0ca59-predictor-b4665bc9b-65wkq\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:57.856154 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.856121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94qpr\" (UniqueName: \"kubernetes.io/projected/4f191223-363e-4407-ad08-1d395af94d5f-kube-api-access-94qpr\") pod \"success-200-isvc-0ca59-predictor-b4665bc9b-65wkq\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:57.856323 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.856195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f191223-363e-4407-ad08-1d395af94d5f-proxy-tls\") pod \"success-200-isvc-0ca59-predictor-b4665bc9b-65wkq\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:57.856323 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.856217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-0ca59-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f191223-363e-4407-ad08-1d395af94d5f-success-200-isvc-0ca59-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0ca59-predictor-b4665bc9b-65wkq\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:57.861661 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.857417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-0ca59-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/4f191223-363e-4407-ad08-1d395af94d5f-success-200-isvc-0ca59-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0ca59-predictor-b4665bc9b-65wkq\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:57.861661 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.859783 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f191223-363e-4407-ad08-1d395af94d5f-proxy-tls\") pod \"success-200-isvc-0ca59-predictor-b4665bc9b-65wkq\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:57.865400 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.865378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94qpr\" (UniqueName: \"kubernetes.io/projected/4f191223-363e-4407-ad08-1d395af94d5f-kube-api-access-94qpr\") pod \"success-200-isvc-0ca59-predictor-b4665bc9b-65wkq\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:57.973664 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:57.973625 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:58.100870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:58.100843 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq"] Apr 28 19:40:58.102749 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:40:58.102711 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f191223_363e_4407_ad08_1d395af94d5f.slice/crio-cbad5cc5bbbea480f2872ecec52b7d1e3e8e27be451e0b44e13d8db1829b4f11 WatchSource:0}: Error finding container cbad5cc5bbbea480f2872ecec52b7d1e3e8e27be451e0b44e13d8db1829b4f11: Status 404 returned error can't find the container with id cbad5cc5bbbea480f2872ecec52b7d1e3e8e27be451e0b44e13d8db1829b4f11 Apr 28 19:40:58.180642 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:58.180585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" event={"ID":"4f191223-363e-4407-ad08-1d395af94d5f","Type":"ContainerStarted","Data":"d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017"} Apr 28 19:40:58.180959 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:58.180704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" event={"ID":"4f191223-363e-4407-ad08-1d395af94d5f","Type":"ContainerStarted","Data":"cbad5cc5bbbea480f2872ecec52b7d1e3e8e27be451e0b44e13d8db1829b4f11"} Apr 28 19:40:58.181726 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:58.181702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" event={"ID":"28c849df-0a23-4f7a-a897-51577af0df9d","Type":"ContainerDied","Data":"9a0c808be01c2357e8b566245579ec3390a4e2cdcf10de17f6a36929f63e2cc5"} Apr 28 19:40:58.181825 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:40:58.181741 2576 scope.go:117] "RemoveContainer" containerID="028d64c1a3d399d3711b9aa18e4626f02796f4a315f19cad0d03bde9d3aa61c1" Apr 28 19:40:58.181825 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:58.181742 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp" Apr 28 19:40:58.183881 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:58.183855 2576 generic.go:358] "Generic (PLEG): container finished" podID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerID="a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39" exitCode=2 Apr 28 19:40:58.183967 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:58.183917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" event={"ID":"dfa7538c-fd88-4bac-ad4b-312c32e20b30","Type":"ContainerDied","Data":"a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39"} Apr 28 19:40:58.203767 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:58.203748 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp"] Apr 28 19:40:58.207760 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:58.207739 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6c7b3-86d869f885-wpbqp"] Apr 28 19:40:58.513500 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:58.513470 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28c849df-0a23-4f7a-a897-51577af0df9d" path="/var/lib/kubelet/pods/28c849df-0a23-4f7a-a897-51577af0df9d/volumes" Apr 28 19:40:59.190527 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:59.190493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" 
event={"ID":"4f191223-363e-4407-ad08-1d395af94d5f","Type":"ContainerStarted","Data":"2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a"} Apr 28 19:40:59.190963 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:59.190659 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:40:59.210516 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:40:59.210465 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" podStartSLOduration=2.210447573 podStartE2EDuration="2.210447573s" podCreationTimestamp="2026-04-28 19:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:40:59.208309499 +0000 UTC m=+1475.171209876" watchObservedRunningTime="2026-04-28 19:40:59.210447573 +0000 UTC m=+1475.173347932" Apr 28 19:41:00.157911 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:00.157867 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerName="ensemble-graph-2a210" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:41:00.194623 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:00.194577 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:41:00.195818 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:00.195794 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:41:00.964427 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:00.964398 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:41:01.085257 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.085156 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4ktb\" (UniqueName: \"kubernetes.io/projected/dfa7538c-fd88-4bac-ad4b-312c32e20b30-kube-api-access-p4ktb\") pod \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " Apr 28 19:41:01.085257 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.085243 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-2a210-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa7538c-fd88-4bac-ad4b-312c32e20b30-success-200-isvc-2a210-kube-rbac-proxy-sar-config\") pod \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " Apr 28 19:41:01.085487 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.085289 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls\") pod \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\" (UID: \"dfa7538c-fd88-4bac-ad4b-312c32e20b30\") " Apr 28 19:41:01.085629 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.085565 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa7538c-fd88-4bac-ad4b-312c32e20b30-success-200-isvc-2a210-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-2a210-kube-rbac-proxy-sar-config") pod "dfa7538c-fd88-4bac-ad4b-312c32e20b30" (UID: "dfa7538c-fd88-4bac-ad4b-312c32e20b30"). InnerVolumeSpecName "success-200-isvc-2a210-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:41:01.087528 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.087503 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dfa7538c-fd88-4bac-ad4b-312c32e20b30" (UID: "dfa7538c-fd88-4bac-ad4b-312c32e20b30"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:41:01.087656 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.087636 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa7538c-fd88-4bac-ad4b-312c32e20b30-kube-api-access-p4ktb" (OuterVolumeSpecName: "kube-api-access-p4ktb") pod "dfa7538c-fd88-4bac-ad4b-312c32e20b30" (UID: "dfa7538c-fd88-4bac-ad4b-312c32e20b30"). InnerVolumeSpecName "kube-api-access-p4ktb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:41:01.186850 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.186809 2576 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-2a210-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa7538c-fd88-4bac-ad4b-312c32e20b30-success-200-isvc-2a210-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:41:01.186850 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.186842 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa7538c-fd88-4bac-ad4b-312c32e20b30-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:41:01.186850 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.186852 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4ktb\" (UniqueName: \"kubernetes.io/projected/dfa7538c-fd88-4bac-ad4b-312c32e20b30-kube-api-access-p4ktb\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 
19:41:01.199413 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.199380 2576 generic.go:358] "Generic (PLEG): container finished" podID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerID="f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409" exitCode=0 Apr 28 19:41:01.199867 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.199424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" event={"ID":"dfa7538c-fd88-4bac-ad4b-312c32e20b30","Type":"ContainerDied","Data":"f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409"} Apr 28 19:41:01.199867 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.199463 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" Apr 28 19:41:01.199867 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.199480 2576 scope.go:117] "RemoveContainer" containerID="a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39" Apr 28 19:41:01.199867 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.199467 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm" event={"ID":"dfa7538c-fd88-4bac-ad4b-312c32e20b30","Type":"ContainerDied","Data":"2307eb434fa62574666cd1730e9c020f33d00ab4d893249d299f53f9c94ca843"} Apr 28 19:41:01.200067 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.200039 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:41:01.208514 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.208495 2576 scope.go:117] "RemoveContainer" 
containerID="f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409" Apr 28 19:41:01.216331 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.216308 2576 scope.go:117] "RemoveContainer" containerID="a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39" Apr 28 19:41:01.216662 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:41:01.216638 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39\": container with ID starting with a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39 not found: ID does not exist" containerID="a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39" Apr 28 19:41:01.216716 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.216673 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39"} err="failed to get container status \"a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39\": rpc error: code = NotFound desc = could not find container \"a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39\": container with ID starting with a8db648776a8cd6fecb4746faeaca354b63ad743e56fbcf9c0b9ce1d9e02fa39 not found: ID does not exist" Apr 28 19:41:01.216716 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.216695 2576 scope.go:117] "RemoveContainer" containerID="f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409" Apr 28 19:41:01.216961 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:41:01.216942 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409\": container with ID starting with f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409 not found: ID does not exist" 
containerID="f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409" Apr 28 19:41:01.217016 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.216972 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409"} err="failed to get container status \"f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409\": rpc error: code = NotFound desc = could not find container \"f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409\": container with ID starting with f2e86e47305aa74164466d240e8e7965a70974a6381756f6a314aa1e748c8409 not found: ID does not exist" Apr 28 19:41:01.221576 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.221552 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm"] Apr 28 19:41:01.226837 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:01.226806 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2a210-predictor-846fc69544-c4mwm"] Apr 28 19:41:02.513295 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:02.513260 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" path="/var/lib/kubelet/pods/dfa7538c-fd88-4bac-ad4b-312c32e20b30/volumes" Apr 28 19:41:05.090383 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:05.090341 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 28 19:41:05.157483 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:05.157448 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" 
podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerName="ensemble-graph-2a210" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:41:06.203577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:06.203548 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:41:06.204121 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:06.204094 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:41:10.157990 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:10.157946 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerName="ensemble-graph-2a210" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:41:10.158426 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:10.158121 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:41:15.090242 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:15.090209 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:41:15.157671 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:15.157595 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerName="ensemble-graph-2a210" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:41:16.205013 ip-10-0-139-184 kubenswrapper[2576]: I0428 
19:41:16.204973 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:41:20.158510 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:20.158471 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerName="ensemble-graph-2a210" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:41:24.521584 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:24.521542 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:41:24.527155 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:24.527129 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:41:24.529057 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:24.529035 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:41:24.534114 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:24.534095 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:41:25.157162 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:25.157116 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerName="ensemble-graph-2a210" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:41:26.204046 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:26.204002 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:41:27.289959 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.289933 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l"] Apr 28 19:41:27.290296 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.290250 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kube-rbac-proxy" Apr 28 19:41:27.290296 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.290260 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kube-rbac-proxy" Apr 28 19:41:27.290296 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.290279 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kserve-container" Apr 28 19:41:27.290296 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.290285 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kserve-container" Apr 28 19:41:27.290422 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.290341 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kube-rbac-proxy" Apr 28 19:41:27.290422 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.290349 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa7538c-fd88-4bac-ad4b-312c32e20b30" containerName="kserve-container" Apr 28 
19:41:27.294493 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.294475 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:27.297199 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.297177 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-26ebe-kube-rbac-proxy-sar-config\"" Apr 28 19:41:27.297287 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.297244 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-26ebe-serving-cert\"" Apr 28 19:41:27.303665 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.303641 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l"] Apr 28 19:41:27.404066 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.404035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1797a188-5e80-4a4b-bae4-da7ab3832311-openshift-service-ca-bundle\") pod \"sequence-graph-26ebe-8fb564fb5-7z45l\" (UID: \"1797a188-5e80-4a4b-bae4-da7ab3832311\") " pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:27.404066 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.404072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1797a188-5e80-4a4b-bae4-da7ab3832311-proxy-tls\") pod \"sequence-graph-26ebe-8fb564fb5-7z45l\" (UID: \"1797a188-5e80-4a4b-bae4-da7ab3832311\") " pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:27.504738 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.504705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1797a188-5e80-4a4b-bae4-da7ab3832311-openshift-service-ca-bundle\") pod \"sequence-graph-26ebe-8fb564fb5-7z45l\" (UID: \"1797a188-5e80-4a4b-bae4-da7ab3832311\") " pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:27.504738 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.504738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1797a188-5e80-4a4b-bae4-da7ab3832311-proxy-tls\") pod \"sequence-graph-26ebe-8fb564fb5-7z45l\" (UID: \"1797a188-5e80-4a4b-bae4-da7ab3832311\") " pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:27.505358 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.505339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1797a188-5e80-4a4b-bae4-da7ab3832311-openshift-service-ca-bundle\") pod \"sequence-graph-26ebe-8fb564fb5-7z45l\" (UID: \"1797a188-5e80-4a4b-bae4-da7ab3832311\") " pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:27.507138 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.507121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1797a188-5e80-4a4b-bae4-da7ab3832311-proxy-tls\") pod \"sequence-graph-26ebe-8fb564fb5-7z45l\" (UID: \"1797a188-5e80-4a4b-bae4-da7ab3832311\") " pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:27.604948 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.604918 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:27.656372 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.656347 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:41:27.739445 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.739358 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l"] Apr 28 19:41:27.742042 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:41:27.742018 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1797a188_5e80_4a4b_bae4_da7ab3832311.slice/crio-7b3f2e769978eb0da3072517faad11cc3a9b3806cd8abf631fbd8ea69cb0f742 WatchSource:0}: Error finding container 7b3f2e769978eb0da3072517faad11cc3a9b3806cd8abf631fbd8ea69cb0f742: Status 404 returned error can't find the container with id 7b3f2e769978eb0da3072517faad11cc3a9b3806cd8abf631fbd8ea69cb0f742 Apr 28 19:41:27.806991 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.806961 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-openshift-service-ca-bundle\") pod \"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\" (UID: \"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\") " Apr 28 19:41:27.807092 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.807005 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-proxy-tls\") pod \"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\" (UID: \"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61\") " Apr 28 19:41:27.807397 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.807368 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" (UID: "6f6b7d47-ecfc-45df-a4ea-c6a83efedf61"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:41:27.809105 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.809083 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" (UID: "6f6b7d47-ecfc-45df-a4ea-c6a83efedf61"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:41:27.907980 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.907911 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:41:27.907980 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:27.907941 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:41:28.288806 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.288762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" event={"ID":"1797a188-5e80-4a4b-bae4-da7ab3832311","Type":"ContainerStarted","Data":"cffb8b9ab039ad167e0fb2cb79219fb70ee647eb5e2b7f3cf53ccf3024b2ee8b"} Apr 28 19:41:28.288806 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.288803 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" event={"ID":"1797a188-5e80-4a4b-bae4-da7ab3832311","Type":"ContainerStarted","Data":"7b3f2e769978eb0da3072517faad11cc3a9b3806cd8abf631fbd8ea69cb0f742"} Apr 28 19:41:28.289064 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.288841 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:28.289758 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.289737 2576 generic.go:358] "Generic (PLEG): container finished" podID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerID="8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430" exitCode=0 Apr 28 19:41:28.289875 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.289787 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" Apr 28 19:41:28.289875 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.289799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" event={"ID":"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61","Type":"ContainerDied","Data":"8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430"} Apr 28 19:41:28.289875 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.289826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j" event={"ID":"6f6b7d47-ecfc-45df-a4ea-c6a83efedf61","Type":"ContainerDied","Data":"cb72b5d440bbc65b875d8730147ab2670720d7188cab1bafe981f30051bc46da"} Apr 28 19:41:28.289875 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.289845 2576 scope.go:117] "RemoveContainer" containerID="8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430" Apr 28 19:41:28.299404 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.299376 2576 scope.go:117] "RemoveContainer" containerID="8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430" Apr 28 19:41:28.299861 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:41:28.299832 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430\": container with ID 
starting with 8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430 not found: ID does not exist" containerID="8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430" Apr 28 19:41:28.299989 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.299885 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430"} err="failed to get container status \"8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430\": rpc error: code = NotFound desc = could not find container \"8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430\": container with ID starting with 8abc0bf8b834608efa8917910db365cf11d935aebe770269502a14e0fa42a430 not found: ID does not exist" Apr 28 19:41:28.305700 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.305660 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" podStartSLOduration=1.30564988 podStartE2EDuration="1.30564988s" podCreationTimestamp="2026-04-28 19:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:41:28.304915842 +0000 UTC m=+1504.267816234" watchObservedRunningTime="2026-04-28 19:41:28.30564988 +0000 UTC m=+1504.268550246" Apr 28 19:41:28.318018 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.317992 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j"] Apr 28 19:41:28.319511 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.319491 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2a210-76dff99855-v7s6j"] Apr 28 19:41:28.515290 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:28.515264 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" 
path="/var/lib/kubelet/pods/6f6b7d47-ecfc-45df-a4ea-c6a83efedf61/volumes" Apr 28 19:41:34.299783 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:34.299751 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:36.204929 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:36.204879 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 28 19:41:37.358867 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.358830 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l"] Apr 28 19:41:37.359238 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.359097 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" containerName="sequence-graph-26ebe" containerID="cri-o://cffb8b9ab039ad167e0fb2cb79219fb70ee647eb5e2b7f3cf53ccf3024b2ee8b" gracePeriod=30 Apr 28 19:41:37.465064 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.465031 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6"] Apr 28 19:41:37.465373 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.465348 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kserve-container" containerID="cri-o://ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5" gracePeriod=30 Apr 28 19:41:37.465460 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.465361 2576 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kube-rbac-proxy" containerID="cri-o://10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7" gracePeriod=30 Apr 28 19:41:37.497520 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.497488 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7"] Apr 28 19:41:37.497863 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.497850 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerName="ensemble-graph-2a210" Apr 28 19:41:37.497907 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.497865 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerName="ensemble-graph-2a210" Apr 28 19:41:37.497940 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.497928 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f6b7d47-ecfc-45df-a4ea-c6a83efedf61" containerName="ensemble-graph-2a210" Apr 28 19:41:37.502118 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.502098 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:37.504745 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.504718 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-7809f-predictor-serving-cert\"" Apr 28 19:41:37.504888 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.504722 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-7809f-kube-rbac-proxy-sar-config\"" Apr 28 19:41:37.511501 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.511401 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7"] Apr 28 19:41:37.694332 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.694231 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-7809f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/140418b4-b40e-44c3-8c66-f09c82585534-success-200-isvc-7809f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7809f-predictor-b4788b9f6-r48p7\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:37.694499 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.694334 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/140418b4-b40e-44c3-8c66-f09c82585534-proxy-tls\") pod \"success-200-isvc-7809f-predictor-b4788b9f6-r48p7\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:37.694499 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.694364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-75jrw\" (UniqueName: \"kubernetes.io/projected/140418b4-b40e-44c3-8c66-f09c82585534-kube-api-access-75jrw\") pod \"success-200-isvc-7809f-predictor-b4788b9f6-r48p7\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:37.795143 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.795102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/140418b4-b40e-44c3-8c66-f09c82585534-proxy-tls\") pod \"success-200-isvc-7809f-predictor-b4788b9f6-r48p7\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:37.795340 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.795160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75jrw\" (UniqueName: \"kubernetes.io/projected/140418b4-b40e-44c3-8c66-f09c82585534-kube-api-access-75jrw\") pod \"success-200-isvc-7809f-predictor-b4788b9f6-r48p7\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:37.795340 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.795202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-7809f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/140418b4-b40e-44c3-8c66-f09c82585534-success-200-isvc-7809f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7809f-predictor-b4788b9f6-r48p7\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:37.795340 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:41:37.795256 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-7809f-predictor-serving-cert: secret "success-200-isvc-7809f-predictor-serving-cert" not 
found Apr 28 19:41:37.795340 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:41:37.795318 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/140418b4-b40e-44c3-8c66-f09c82585534-proxy-tls podName:140418b4-b40e-44c3-8c66-f09c82585534 nodeName:}" failed. No retries permitted until 2026-04-28 19:41:38.295300054 +0000 UTC m=+1514.258200397 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/140418b4-b40e-44c3-8c66-f09c82585534-proxy-tls") pod "success-200-isvc-7809f-predictor-b4788b9f6-r48p7" (UID: "140418b4-b40e-44c3-8c66-f09c82585534") : secret "success-200-isvc-7809f-predictor-serving-cert" not found Apr 28 19:41:37.795974 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.795948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-7809f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/140418b4-b40e-44c3-8c66-f09c82585534-success-200-isvc-7809f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-7809f-predictor-b4788b9f6-r48p7\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:37.810229 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:37.810200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75jrw\" (UniqueName: \"kubernetes.io/projected/140418b4-b40e-44c3-8c66-f09c82585534-kube-api-access-75jrw\") pod \"success-200-isvc-7809f-predictor-b4788b9f6-r48p7\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:38.300534 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:38.300494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/140418b4-b40e-44c3-8c66-f09c82585534-proxy-tls\") pod 
\"success-200-isvc-7809f-predictor-b4788b9f6-r48p7\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:38.303151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:38.303128 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/140418b4-b40e-44c3-8c66-f09c82585534-proxy-tls\") pod \"success-200-isvc-7809f-predictor-b4788b9f6-r48p7\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:38.327665 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:38.327631 2576 generic.go:358] "Generic (PLEG): container finished" podID="10453b2a-d390-4d9c-9088-02ea43478764" containerID="10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7" exitCode=2 Apr 28 19:41:38.327665 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:38.327638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" event={"ID":"10453b2a-d390-4d9c-9088-02ea43478764","Type":"ContainerDied","Data":"10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7"} Apr 28 19:41:38.415739 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:38.415700 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:38.548248 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:38.548223 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7"] Apr 28 19:41:38.549783 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:41:38.549748 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod140418b4_b40e_44c3_8c66_f09c82585534.slice/crio-324165cc17f7c17b667e97d14c5cdeade1a4583647e4424c37c9815988a41efe WatchSource:0}: Error finding container 324165cc17f7c17b667e97d14c5cdeade1a4583647e4424c37c9815988a41efe: Status 404 returned error can't find the container with id 324165cc17f7c17b667e97d14c5cdeade1a4583647e4424c37c9815988a41efe Apr 28 19:41:39.298872 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:39.298829 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" containerName="sequence-graph-26ebe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:41:39.332402 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:39.332370 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" event={"ID":"140418b4-b40e-44c3-8c66-f09c82585534","Type":"ContainerStarted","Data":"a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b"} Apr 28 19:41:39.332554 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:39.332407 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" event={"ID":"140418b4-b40e-44c3-8c66-f09c82585534","Type":"ContainerStarted","Data":"0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08"} Apr 28 19:41:39.332554 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:41:39.332417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" event={"ID":"140418b4-b40e-44c3-8c66-f09c82585534","Type":"ContainerStarted","Data":"324165cc17f7c17b667e97d14c5cdeade1a4583647e4424c37c9815988a41efe"} Apr 28 19:41:39.332676 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:39.332619 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:39.332676 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:39.332649 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:39.333967 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:39.333944 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 28 19:41:39.350682 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:39.350619 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podStartSLOduration=2.350573084 podStartE2EDuration="2.350573084s" podCreationTimestamp="2026-04-28 19:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:41:39.349207439 +0000 UTC m=+1515.312107806" watchObservedRunningTime="2026-04-28 19:41:39.350573084 +0000 UTC m=+1515.313473452" Apr 28 19:41:40.085288 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.085242 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 28 19:41:40.336150 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.336060 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 28 19:41:40.602912 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.602884 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:41:40.620169 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.620081 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gqct\" (UniqueName: \"kubernetes.io/projected/10453b2a-d390-4d9c-9088-02ea43478764-kube-api-access-5gqct\") pod \"10453b2a-d390-4d9c-9088-02ea43478764\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " Apr 28 19:41:40.622476 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.622442 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10453b2a-d390-4d9c-9088-02ea43478764-kube-api-access-5gqct" (OuterVolumeSpecName: "kube-api-access-5gqct") pod "10453b2a-d390-4d9c-9088-02ea43478764" (UID: "10453b2a-d390-4d9c-9088-02ea43478764"). InnerVolumeSpecName "kube-api-access-5gqct". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:41:40.721249 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.721205 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-26ebe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/10453b2a-d390-4d9c-9088-02ea43478764-success-200-isvc-26ebe-kube-rbac-proxy-sar-config\") pod \"10453b2a-d390-4d9c-9088-02ea43478764\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " Apr 28 19:41:40.721403 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.721264 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10453b2a-d390-4d9c-9088-02ea43478764-proxy-tls\") pod \"10453b2a-d390-4d9c-9088-02ea43478764\" (UID: \"10453b2a-d390-4d9c-9088-02ea43478764\") " Apr 28 19:41:40.721498 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.721483 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5gqct\" (UniqueName: \"kubernetes.io/projected/10453b2a-d390-4d9c-9088-02ea43478764-kube-api-access-5gqct\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:41:40.721649 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.721629 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10453b2a-d390-4d9c-9088-02ea43478764-success-200-isvc-26ebe-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-26ebe-kube-rbac-proxy-sar-config") pod "10453b2a-d390-4d9c-9088-02ea43478764" (UID: "10453b2a-d390-4d9c-9088-02ea43478764"). InnerVolumeSpecName "success-200-isvc-26ebe-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:41:40.723617 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.723579 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10453b2a-d390-4d9c-9088-02ea43478764-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "10453b2a-d390-4d9c-9088-02ea43478764" (UID: "10453b2a-d390-4d9c-9088-02ea43478764"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:41:40.821959 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.821907 2576 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-26ebe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/10453b2a-d390-4d9c-9088-02ea43478764-success-200-isvc-26ebe-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:41:40.821959 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:40.821952 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10453b2a-d390-4d9c-9088-02ea43478764-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:41:41.342456 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.342420 2576 generic.go:358] "Generic (PLEG): container finished" podID="10453b2a-d390-4d9c-9088-02ea43478764" containerID="ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5" exitCode=0 Apr 28 19:41:41.342929 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.342488 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" Apr 28 19:41:41.342929 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.342499 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" event={"ID":"10453b2a-d390-4d9c-9088-02ea43478764","Type":"ContainerDied","Data":"ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5"} Apr 28 19:41:41.342929 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.342538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6" event={"ID":"10453b2a-d390-4d9c-9088-02ea43478764","Type":"ContainerDied","Data":"9e4e9fefb12308bdfcaba4d432d3a498670cfc3099548c56aa3b52ff630498c0"} Apr 28 19:41:41.342929 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.342554 2576 scope.go:117] "RemoveContainer" containerID="10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7" Apr 28 19:41:41.352653 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.352597 2576 scope.go:117] "RemoveContainer" containerID="ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5" Apr 28 19:41:41.359729 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.359707 2576 scope.go:117] "RemoveContainer" containerID="10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7" Apr 28 19:41:41.359962 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:41:41.359942 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7\": container with ID starting with 10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7 not found: ID does not exist" containerID="10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7" Apr 28 19:41:41.360029 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.359974 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7"} err="failed to get container status \"10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7\": rpc error: code = NotFound desc = could not find container \"10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7\": container with ID starting with 10afb3f960d784ae9bdf3f33caa109a3c1d73823b9d7d0b84e1d812ff53cf6a7 not found: ID does not exist" Apr 28 19:41:41.360029 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.359999 2576 scope.go:117] "RemoveContainer" containerID="ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5" Apr 28 19:41:41.360254 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:41:41.360236 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5\": container with ID starting with ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5 not found: ID does not exist" containerID="ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5" Apr 28 19:41:41.360411 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.360260 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5"} err="failed to get container status \"ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5\": rpc error: code = NotFound desc = could not find container \"ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5\": container with ID starting with ff4869663af6adeec9f40306311c9d51eeda0db22753198d3439ccb971a63df5 not found: ID does not exist" Apr 28 19:41:41.364332 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.364313 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6"] Apr 28 19:41:41.367932 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:41.367914 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-26ebe-predictor-585cfc74d-f52v6"] Apr 28 19:41:42.513995 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:42.513919 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10453b2a-d390-4d9c-9088-02ea43478764" path="/var/lib/kubelet/pods/10453b2a-d390-4d9c-9088-02ea43478764/volumes" Apr 28 19:41:44.298360 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:44.298318 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" containerName="sequence-graph-26ebe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:41:45.341235 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:45.341203 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:41:45.341746 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:45.341715 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 28 19:41:46.204766 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:46.204737 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:41:49.298167 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:49.298129 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" 
containerName="sequence-graph-26ebe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:41:49.298502 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:49.298226 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:41:54.298676 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:54.298634 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" containerName="sequence-graph-26ebe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:41:55.342548 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:55.342508 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 28 19:41:57.727671 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.727630 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49"] Apr 28 19:41:57.728089 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.727984 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kube-rbac-proxy" Apr 28 19:41:57.728089 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.727995 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kube-rbac-proxy" Apr 28 19:41:57.728089 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.728005 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kserve-container" Apr 28 19:41:57.728089 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:41:57.728011 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kserve-container" Apr 28 19:41:57.728089 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.728063 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kserve-container" Apr 28 19:41:57.728089 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.728076 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="10453b2a-d390-4d9c-9088-02ea43478764" containerName="kube-rbac-proxy" Apr 28 19:41:57.730980 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.730961 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:41:57.731563 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.731529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-proxy-tls\") pod \"ensemble-graph-0ca59-8bcb68dfb-v4b49\" (UID: \"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4\") " pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:41:57.731695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.731649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-openshift-service-ca-bundle\") pod \"ensemble-graph-0ca59-8bcb68dfb-v4b49\" (UID: \"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4\") " pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:41:57.733575 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.733557 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-0ca59-kube-rbac-proxy-sar-config\"" Apr 
28 19:41:57.733679 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.733558 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-0ca59-serving-cert\"" Apr 28 19:41:57.742655 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.742627 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49"] Apr 28 19:41:57.832037 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.832000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-proxy-tls\") pod \"ensemble-graph-0ca59-8bcb68dfb-v4b49\" (UID: \"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4\") " pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:41:57.832208 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.832046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-openshift-service-ca-bundle\") pod \"ensemble-graph-0ca59-8bcb68dfb-v4b49\" (UID: \"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4\") " pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:41:57.832641 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.832618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-openshift-service-ca-bundle\") pod \"ensemble-graph-0ca59-8bcb68dfb-v4b49\" (UID: \"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4\") " pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:41:57.834785 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:57.834742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-proxy-tls\") pod \"ensemble-graph-0ca59-8bcb68dfb-v4b49\" (UID: \"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4\") " pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:41:58.041318 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:58.041281 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:41:58.163287 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:58.163262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49"] Apr 28 19:41:58.164939 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:41:58.164908 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ce3f5e5_d1d4_453c_9956_fb5b0e720ca4.slice/crio-e3915b04d34fce2f06abd93d9609aa68f480937996cbc375642128491fba385c WatchSource:0}: Error finding container e3915b04d34fce2f06abd93d9609aa68f480937996cbc375642128491fba385c: Status 404 returned error can't find the container with id e3915b04d34fce2f06abd93d9609aa68f480937996cbc375642128491fba385c Apr 28 19:41:58.404795 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:58.404697 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" event={"ID":"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4","Type":"ContainerStarted","Data":"30ab35661630317a2448f3e5452f46121ed62f3cbbad189edd4bb96e0e8f6c66"} Apr 28 19:41:58.404795 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:58.404745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" event={"ID":"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4","Type":"ContainerStarted","Data":"e3915b04d34fce2f06abd93d9609aa68f480937996cbc375642128491fba385c"} Apr 28 19:41:58.405012 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:58.404840 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:41:58.421406 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:58.421354 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" podStartSLOduration=1.421340099 podStartE2EDuration="1.421340099s" podCreationTimestamp="2026-04-28 19:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:41:58.420307986 +0000 UTC m=+1534.383208353" watchObservedRunningTime="2026-04-28 19:41:58.421340099 +0000 UTC m=+1534.384240465" Apr 28 19:41:59.298735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:41:59.298698 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" containerName="sequence-graph-26ebe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:42:04.298211 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:04.298172 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" containerName="sequence-graph-26ebe" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:42:04.414202 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:04.414165 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:42:05.342460 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:05.342421 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.41:8080: connect: connection refused" Apr 28 19:42:07.441628 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:07.441529 2576 generic.go:358] "Generic (PLEG): container finished" podID="1797a188-5e80-4a4b-bae4-da7ab3832311" containerID="cffb8b9ab039ad167e0fb2cb79219fb70ee647eb5e2b7f3cf53ccf3024b2ee8b" exitCode=0 Apr 28 19:42:07.442042 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:07.441656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" event={"ID":"1797a188-5e80-4a4b-bae4-da7ab3832311","Type":"ContainerDied","Data":"cffb8b9ab039ad167e0fb2cb79219fb70ee647eb5e2b7f3cf53ccf3024b2ee8b"} Apr 28 19:42:07.504487 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:07.504462 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:42:07.609559 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:07.609527 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1797a188-5e80-4a4b-bae4-da7ab3832311-proxy-tls\") pod \"1797a188-5e80-4a4b-bae4-da7ab3832311\" (UID: \"1797a188-5e80-4a4b-bae4-da7ab3832311\") " Apr 28 19:42:07.609559 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:07.609563 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1797a188-5e80-4a4b-bae4-da7ab3832311-openshift-service-ca-bundle\") pod \"1797a188-5e80-4a4b-bae4-da7ab3832311\" (UID: \"1797a188-5e80-4a4b-bae4-da7ab3832311\") " Apr 28 19:42:07.609967 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:07.609938 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1797a188-5e80-4a4b-bae4-da7ab3832311-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod 
"1797a188-5e80-4a4b-bae4-da7ab3832311" (UID: "1797a188-5e80-4a4b-bae4-da7ab3832311"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:42:07.611867 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:07.611850 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1797a188-5e80-4a4b-bae4-da7ab3832311-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1797a188-5e80-4a4b-bae4-da7ab3832311" (UID: "1797a188-5e80-4a4b-bae4-da7ab3832311"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:42:07.710861 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:07.710825 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1797a188-5e80-4a4b-bae4-da7ab3832311-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:42:07.710861 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:07.710854 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1797a188-5e80-4a4b-bae4-da7ab3832311-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:42:08.445878 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:08.445845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" event={"ID":"1797a188-5e80-4a4b-bae4-da7ab3832311","Type":"ContainerDied","Data":"7b3f2e769978eb0da3072517faad11cc3a9b3806cd8abf631fbd8ea69cb0f742"} Apr 28 19:42:08.446306 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:08.445890 2576 scope.go:117] "RemoveContainer" containerID="cffb8b9ab039ad167e0fb2cb79219fb70ee647eb5e2b7f3cf53ccf3024b2ee8b" Apr 28 19:42:08.446306 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:08.445909 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l" Apr 28 19:42:08.468425 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:08.468400 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l"] Apr 28 19:42:08.472473 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:08.472446 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-26ebe-8fb564fb5-7z45l"] Apr 28 19:42:08.514098 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:08.514068 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" path="/var/lib/kubelet/pods/1797a188-5e80-4a4b-bae4-da7ab3832311/volumes" Apr 28 19:42:15.342770 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:15.342723 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 28 19:42:25.343232 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:25.343200 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:42:37.582711 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.582673 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl"] Apr 28 19:42:37.583272 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.583188 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" containerName="sequence-graph-26ebe" Apr 28 19:42:37.583272 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.583207 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" 
containerName="sequence-graph-26ebe" Apr 28 19:42:37.583418 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.583291 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1797a188-5e80-4a4b-bae4-da7ab3832311" containerName="sequence-graph-26ebe" Apr 28 19:42:37.587638 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.587589 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:42:37.590105 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.590081 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-7809f-serving-cert\"" Apr 28 19:42:37.590204 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.590080 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-7809f-kube-rbac-proxy-sar-config\"" Apr 28 19:42:37.595463 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.595440 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl"] Apr 28 19:42:37.657525 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.657497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-proxy-tls\") pod \"sequence-graph-7809f-56768674bd-l46kl\" (UID: \"8a0de09b-f33b-4bc3-a0d1-00b83d801e34\") " pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:42:37.657698 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.657631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-openshift-service-ca-bundle\") pod \"sequence-graph-7809f-56768674bd-l46kl\" (UID: 
\"8a0de09b-f33b-4bc3-a0d1-00b83d801e34\") " pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:42:37.758415 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.758377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-proxy-tls\") pod \"sequence-graph-7809f-56768674bd-l46kl\" (UID: \"8a0de09b-f33b-4bc3-a0d1-00b83d801e34\") " pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:42:37.758582 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.758465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-openshift-service-ca-bundle\") pod \"sequence-graph-7809f-56768674bd-l46kl\" (UID: \"8a0de09b-f33b-4bc3-a0d1-00b83d801e34\") " pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:42:37.759158 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.759138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-openshift-service-ca-bundle\") pod \"sequence-graph-7809f-56768674bd-l46kl\" (UID: \"8a0de09b-f33b-4bc3-a0d1-00b83d801e34\") " pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:42:37.760943 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.760912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-proxy-tls\") pod \"sequence-graph-7809f-56768674bd-l46kl\" (UID: \"8a0de09b-f33b-4bc3-a0d1-00b83d801e34\") " pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:42:37.898495 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:37.898415 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:42:38.023477 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:38.023378 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl"] Apr 28 19:42:38.025949 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:42:38.025913 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a0de09b_f33b_4bc3_a0d1_00b83d801e34.slice/crio-b65e3c633cf46c5957c948c721aaf2d70b34be755f0ef8f7125cd148ff01357d WatchSource:0}: Error finding container b65e3c633cf46c5957c948c721aaf2d70b34be755f0ef8f7125cd148ff01357d: Status 404 returned error can't find the container with id b65e3c633cf46c5957c948c721aaf2d70b34be755f0ef8f7125cd148ff01357d Apr 28 19:42:38.543521 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:38.543484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" event={"ID":"8a0de09b-f33b-4bc3-a0d1-00b83d801e34","Type":"ContainerStarted","Data":"442ff091d44cc276df277ae6d47ecdde7953d91486fc72110f34c40a7e492b8b"} Apr 28 19:42:38.543521 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:38.543523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" event={"ID":"8a0de09b-f33b-4bc3-a0d1-00b83d801e34","Type":"ContainerStarted","Data":"b65e3c633cf46c5957c948c721aaf2d70b34be755f0ef8f7125cd148ff01357d"} Apr 28 19:42:38.543792 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:38.543627 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:42:38.560787 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:38.560738 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" 
podStartSLOduration=1.560725753 podStartE2EDuration="1.560725753s" podCreationTimestamp="2026-04-28 19:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:42:38.558669973 +0000 UTC m=+1574.521570336" watchObservedRunningTime="2026-04-28 19:42:38.560725753 +0000 UTC m=+1574.523626119" Apr 28 19:42:44.553840 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:42:44.553810 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:46:24.542130 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:46:24.542106 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:46:24.547864 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:46:24.547840 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:46:24.553533 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:46:24.553509 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:46:24.558336 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:46:24.558317 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:50:12.506510 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.506477 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49"] Apr 28 19:50:12.508966 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.506740 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" containerName="ensemble-graph-0ca59" containerID="cri-o://30ab35661630317a2448f3e5452f46121ed62f3cbbad189edd4bb96e0e8f6c66" gracePeriod=30 Apr 28 19:50:12.615551 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.615518 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq"] Apr 28 19:50:12.615871 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.615838 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kserve-container" containerID="cri-o://d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017" gracePeriod=30 Apr 28 19:50:12.615998 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.615894 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kube-rbac-proxy" containerID="cri-o://2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a" gracePeriod=30 Apr 28 19:50:12.703108 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.703081 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr"] Apr 28 19:50:12.706539 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.706515 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:12.709237 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.709216 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6bad9-predictor-serving-cert\"" Apr 28 19:50:12.709351 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.709224 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-6bad9-kube-rbac-proxy-sar-config\"" Apr 28 19:50:12.725549 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.725520 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr"] Apr 28 19:50:12.834479 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.834385 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn64x\" (UniqueName: \"kubernetes.io/projected/e692a5c3-0746-4dbb-9f52-0baa64cb948f-kube-api-access-wn64x\") pod \"success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") " pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:12.834479 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.834442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-6bad9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e692a5c3-0746-4dbb-9f52-0baa64cb948f-success-200-isvc-6bad9-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") " pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:12.834479 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.834476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e692a5c3-0746-4dbb-9f52-0baa64cb948f-proxy-tls\") pod \"success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") " pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:12.935224 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.935182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn64x\" (UniqueName: \"kubernetes.io/projected/e692a5c3-0746-4dbb-9f52-0baa64cb948f-kube-api-access-wn64x\") pod \"success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") " pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:12.935224 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.935228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-6bad9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e692a5c3-0746-4dbb-9f52-0baa64cb948f-success-200-isvc-6bad9-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") " pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:12.935505 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.935252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e692a5c3-0746-4dbb-9f52-0baa64cb948f-proxy-tls\") pod \"success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") " pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:12.935975 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.935950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-6bad9-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/e692a5c3-0746-4dbb-9f52-0baa64cb948f-success-200-isvc-6bad9-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") " pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:12.938000 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.937978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e692a5c3-0746-4dbb-9f52-0baa64cb948f-proxy-tls\") pod \"success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") " pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:12.944027 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:12.944000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn64x\" (UniqueName: \"kubernetes.io/projected/e692a5c3-0746-4dbb-9f52-0baa64cb948f-kube-api-access-wn64x\") pod \"success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") " pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:13.016226 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:13.016200 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:13.020716 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:13.020686 2576 generic.go:358] "Generic (PLEG): container finished" podID="4f191223-363e-4407-ad08-1d395af94d5f" containerID="2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a" exitCode=2 Apr 28 19:50:13.020816 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:13.020754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" event={"ID":"4f191223-363e-4407-ad08-1d395af94d5f","Type":"ContainerDied","Data":"2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a"} Apr 28 19:50:13.139656 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:13.139622 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr"] Apr 28 19:50:13.143096 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:50:13.143050 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice/crio-76cf8509c575fd476b7f9ee9cf694e10f63d7e223c99431cbfbdaef7f12ff4ee WatchSource:0}: Error finding container 76cf8509c575fd476b7f9ee9cf694e10f63d7e223c99431cbfbdaef7f12ff4ee: Status 404 returned error can't find the container with id 76cf8509c575fd476b7f9ee9cf694e10f63d7e223c99431cbfbdaef7f12ff4ee Apr 28 19:50:13.144916 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:13.144897 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:50:14.026956 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:14.026913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" 
event={"ID":"e692a5c3-0746-4dbb-9f52-0baa64cb948f","Type":"ContainerStarted","Data":"4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f"} Apr 28 19:50:14.026956 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:14.026952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" event={"ID":"e692a5c3-0746-4dbb-9f52-0baa64cb948f","Type":"ContainerStarted","Data":"07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96"} Apr 28 19:50:14.026956 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:14.026964 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" event={"ID":"e692a5c3-0746-4dbb-9f52-0baa64cb948f","Type":"ContainerStarted","Data":"76cf8509c575fd476b7f9ee9cf694e10f63d7e223c99431cbfbdaef7f12ff4ee"} Apr 28 19:50:14.027440 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:14.027054 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:14.047862 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:14.047802 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" podStartSLOduration=2.047782208 podStartE2EDuration="2.047782208s" podCreationTimestamp="2026-04-28 19:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:50:14.044795109 +0000 UTC m=+2030.007695496" watchObservedRunningTime="2026-04-28 19:50:14.047782208 +0000 UTC m=+2030.010682574" Apr 28 19:50:14.412964 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:14.412869 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" 
containerName="ensemble-graph-0ca59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:50:15.031274 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.031238 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:15.032657 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.032632 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 28 19:50:15.756117 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.756094 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:50:15.856858 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.856786 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-0ca59-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f191223-363e-4407-ad08-1d395af94d5f-success-200-isvc-0ca59-kube-rbac-proxy-sar-config\") pod \"4f191223-363e-4407-ad08-1d395af94d5f\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " Apr 28 19:50:15.856987 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.856871 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f191223-363e-4407-ad08-1d395af94d5f-proxy-tls\") pod \"4f191223-363e-4407-ad08-1d395af94d5f\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " Apr 28 19:50:15.856987 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.856932 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94qpr\" (UniqueName: 
\"kubernetes.io/projected/4f191223-363e-4407-ad08-1d395af94d5f-kube-api-access-94qpr\") pod \"4f191223-363e-4407-ad08-1d395af94d5f\" (UID: \"4f191223-363e-4407-ad08-1d395af94d5f\") " Apr 28 19:50:15.857122 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.857100 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f191223-363e-4407-ad08-1d395af94d5f-success-200-isvc-0ca59-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-0ca59-kube-rbac-proxy-sar-config") pod "4f191223-363e-4407-ad08-1d395af94d5f" (UID: "4f191223-363e-4407-ad08-1d395af94d5f"). InnerVolumeSpecName "success-200-isvc-0ca59-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:50:15.859092 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.859066 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f191223-363e-4407-ad08-1d395af94d5f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4f191223-363e-4407-ad08-1d395af94d5f" (UID: "4f191223-363e-4407-ad08-1d395af94d5f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:50:15.859212 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.859094 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f191223-363e-4407-ad08-1d395af94d5f-kube-api-access-94qpr" (OuterVolumeSpecName: "kube-api-access-94qpr") pod "4f191223-363e-4407-ad08-1d395af94d5f" (UID: "4f191223-363e-4407-ad08-1d395af94d5f"). InnerVolumeSpecName "kube-api-access-94qpr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:50:15.957827 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.957783 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f191223-363e-4407-ad08-1d395af94d5f-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:50:15.957827 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.957821 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94qpr\" (UniqueName: \"kubernetes.io/projected/4f191223-363e-4407-ad08-1d395af94d5f-kube-api-access-94qpr\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:50:15.957827 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:15.957833 2576 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-0ca59-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f191223-363e-4407-ad08-1d395af94d5f-success-200-isvc-0ca59-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:50:16.036018 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.035983 2576 generic.go:358] "Generic (PLEG): container finished" podID="4f191223-363e-4407-ad08-1d395af94d5f" containerID="d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017" exitCode=0 Apr 28 19:50:16.036348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.036071 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" Apr 28 19:50:16.036348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.036073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" event={"ID":"4f191223-363e-4407-ad08-1d395af94d5f","Type":"ContainerDied","Data":"d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017"} Apr 28 19:50:16.036348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.036181 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq" event={"ID":"4f191223-363e-4407-ad08-1d395af94d5f","Type":"ContainerDied","Data":"cbad5cc5bbbea480f2872ecec52b7d1e3e8e27be451e0b44e13d8db1829b4f11"} Apr 28 19:50:16.036348 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.036201 2576 scope.go:117] "RemoveContainer" containerID="2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a" Apr 28 19:50:16.036811 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.036786 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 28 19:50:16.045095 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.045080 2576 scope.go:117] "RemoveContainer" containerID="d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017" Apr 28 19:50:16.052255 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.052240 2576 scope.go:117] "RemoveContainer" containerID="2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a" Apr 28 19:50:16.052504 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:50:16.052487 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a\": container with ID starting with 2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a not found: ID does not exist" containerID="2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a" Apr 28 19:50:16.052569 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.052517 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a"} err="failed to get container status \"2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a\": rpc error: code = NotFound desc = could not find container \"2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a\": container with ID starting with 2702e72a4ea04567285b64982c2540f4ef215b04951e191cf1ef4042cccba10a not found: ID does not exist" Apr 28 19:50:16.052569 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.052540 2576 scope.go:117] "RemoveContainer" containerID="d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017" Apr 28 19:50:16.052823 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:50:16.052798 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017\": container with ID starting with d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017 not found: ID does not exist" containerID="d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017" Apr 28 19:50:16.052905 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.052828 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017"} err="failed to get container status \"d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017\": rpc error: code = NotFound desc = could not find container 
\"d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017\": container with ID starting with d403fe53d147a14bbeb3d2b8c7dd13be8e5085f0949251d32fb59e4849b19017 not found: ID does not exist" Apr 28 19:50:16.059193 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.059170 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq"] Apr 28 19:50:16.063142 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.063120 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0ca59-predictor-b4665bc9b-65wkq"] Apr 28 19:50:16.517140 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:16.517106 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f191223-363e-4407-ad08-1d395af94d5f" path="/var/lib/kubelet/pods/4f191223-363e-4407-ad08-1d395af94d5f/volumes" Apr 28 19:50:19.413151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:19.413107 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" containerName="ensemble-graph-0ca59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:50:21.041905 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:21.041879 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:50:21.042414 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:21.042388 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 28 19:50:24.412502 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:24.412449 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" containerName="ensemble-graph-0ca59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:50:24.412911 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:24.412569 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:50:29.411965 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:29.411923 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" containerName="ensemble-graph-0ca59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:50:31.042491 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:31.042452 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 28 19:50:34.412062 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:34.412022 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" containerName="ensemble-graph-0ca59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:50:39.412127 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:39.412082 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" containerName="ensemble-graph-0ca59" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:50:41.042276 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:41.042242 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 28 19:50:43.124984 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:43.124952 2576 generic.go:358] "Generic (PLEG): container finished" podID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" containerID="30ab35661630317a2448f3e5452f46121ed62f3cbbad189edd4bb96e0e8f6c66" exitCode=0 Apr 28 19:50:43.125295 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:43.125010 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" event={"ID":"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4","Type":"ContainerDied","Data":"30ab35661630317a2448f3e5452f46121ed62f3cbbad189edd4bb96e0e8f6c66"} Apr 28 19:50:43.160665 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:43.160645 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:50:43.281632 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:43.281564 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-proxy-tls\") pod \"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4\" (UID: \"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4\") " Apr 28 19:50:43.281802 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:43.281654 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-openshift-service-ca-bundle\") pod \"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4\" (UID: \"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4\") " Apr 28 19:50:43.282006 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:43.281980 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" (UID: "7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:50:43.283840 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:43.283820 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" (UID: "7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:50:43.382313 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:43.382286 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:50:43.382313 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:43.382312 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:50:44.129720 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:44.129684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" event={"ID":"7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4","Type":"ContainerDied","Data":"e3915b04d34fce2f06abd93d9609aa68f480937996cbc375642128491fba385c"} Apr 28 19:50:44.129720 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:44.129727 2576 scope.go:117] "RemoveContainer" containerID="30ab35661630317a2448f3e5452f46121ed62f3cbbad189edd4bb96e0e8f6c66" Apr 28 19:50:44.130201 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:44.129734 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49" Apr 28 19:50:44.157344 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:44.157317 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49"] Apr 28 19:50:44.164765 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:44.164745 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-0ca59-8bcb68dfb-v4b49"] Apr 28 19:50:44.513980 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:44.513951 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" path="/var/lib/kubelet/pods/7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4/volumes" Apr 28 19:50:51.042471 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:51.042431 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 28 19:50:52.232623 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.232572 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl"] Apr 28 19:50:52.233008 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.232812 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerName="sequence-graph-7809f" containerID="cri-o://442ff091d44cc276df277ae6d47ecdde7953d91486fc72110f34c40a7e492b8b" gracePeriod=30 Apr 28 19:50:52.390801 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.390767 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7"] Apr 28 19:50:52.391067 ip-10-0-139-184 
kubenswrapper[2576]: I0428 19:50:52.391040 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" containerID="cri-o://0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08" gracePeriod=30 Apr 28 19:50:52.391211 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.391128 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kube-rbac-proxy" containerID="cri-o://a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b" gracePeriod=30 Apr 28 19:50:52.418705 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.418676 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw"] Apr 28 19:50:52.419014 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.419003 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kserve-container" Apr 28 19:50:52.419056 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.419016 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kserve-container" Apr 28 19:50:52.419056 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.419028 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kube-rbac-proxy" Apr 28 19:50:52.419056 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.419033 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kube-rbac-proxy" Apr 28 19:50:52.419056 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.419049 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" containerName="ensemble-graph-0ca59" Apr 28 19:50:52.419056 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.419056 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" containerName="ensemble-graph-0ca59" Apr 28 19:50:52.419208 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.419108 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kserve-container" Apr 28 19:50:52.419208 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.419115 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f191223-363e-4407-ad08-1d395af94d5f" containerName="kube-rbac-proxy" Apr 28 19:50:52.419208 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.419125 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ce3f5e5-d1d4-453c-9956-fb5b0e720ca4" containerName="ensemble-graph-0ca59" Apr 28 19:50:52.423585 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.423566 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.426304 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.426281 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e5e5a-predictor-serving-cert\"" Apr 28 19:50:52.426432 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.426413 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\"" Apr 28 19:50:52.432388 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.432366 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw"] Apr 28 19:50:52.454997 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.454954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7xms\" (UniqueName: \"kubernetes.io/projected/dfc04099-98ac-4199-beeb-90bdf50f0477-kube-api-access-s7xms\") pod \"success-200-isvc-e5e5a-predictor-7bd5747694-zwckw\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") " pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.455171 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.455033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfc04099-98ac-4199-beeb-90bdf50f0477-success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e5e5a-predictor-7bd5747694-zwckw\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") " pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.455171 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.455057 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfc04099-98ac-4199-beeb-90bdf50f0477-proxy-tls\") pod \"success-200-isvc-e5e5a-predictor-7bd5747694-zwckw\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") " pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.555536 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.555497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7xms\" (UniqueName: \"kubernetes.io/projected/dfc04099-98ac-4199-beeb-90bdf50f0477-kube-api-access-s7xms\") pod \"success-200-isvc-e5e5a-predictor-7bd5747694-zwckw\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") " pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.555730 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.555588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfc04099-98ac-4199-beeb-90bdf50f0477-success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e5e5a-predictor-7bd5747694-zwckw\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") " pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.555730 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.555635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfc04099-98ac-4199-beeb-90bdf50f0477-proxy-tls\") pod \"success-200-isvc-e5e5a-predictor-7bd5747694-zwckw\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") " pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.556341 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.556315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/dfc04099-98ac-4199-beeb-90bdf50f0477-success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e5e5a-predictor-7bd5747694-zwckw\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") " pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.558271 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.558249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfc04099-98ac-4199-beeb-90bdf50f0477-proxy-tls\") pod \"success-200-isvc-e5e5a-predictor-7bd5747694-zwckw\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") " pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.563778 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.563748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7xms\" (UniqueName: \"kubernetes.io/projected/dfc04099-98ac-4199-beeb-90bdf50f0477-kube-api-access-s7xms\") pod \"success-200-isvc-e5e5a-predictor-7bd5747694-zwckw\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") " pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.734364 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.734324 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:52.857726 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:52.857641 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw"] Apr 28 19:50:52.860044 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:50:52.860022 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfc04099_98ac_4199_beeb_90bdf50f0477.slice/crio-b0e829fc43fa5a17b8162244cdb792f3ced292594962e40f198cbb24329c970b WatchSource:0}: Error finding container b0e829fc43fa5a17b8162244cdb792f3ced292594962e40f198cbb24329c970b: Status 404 returned error can't find the container with id b0e829fc43fa5a17b8162244cdb792f3ced292594962e40f198cbb24329c970b Apr 28 19:50:53.159251 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:53.159167 2576 generic.go:358] "Generic (PLEG): container finished" podID="140418b4-b40e-44c3-8c66-f09c82585534" containerID="a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b" exitCode=2 Apr 28 19:50:53.159251 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:53.159226 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" event={"ID":"140418b4-b40e-44c3-8c66-f09c82585534","Type":"ContainerDied","Data":"a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b"} Apr 28 19:50:53.160809 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:53.160784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" event={"ID":"dfc04099-98ac-4199-beeb-90bdf50f0477","Type":"ContainerStarted","Data":"ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf"} Apr 28 19:50:53.160809 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:53.160812 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" event={"ID":"dfc04099-98ac-4199-beeb-90bdf50f0477","Type":"ContainerStarted","Data":"238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a"} Apr 28 19:50:53.160993 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:53.160821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" event={"ID":"dfc04099-98ac-4199-beeb-90bdf50f0477","Type":"ContainerStarted","Data":"b0e829fc43fa5a17b8162244cdb792f3ced292594962e40f198cbb24329c970b"} Apr 28 19:50:53.160993 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:53.160960 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:53.184735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:53.184684 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" podStartSLOduration=1.184671139 podStartE2EDuration="1.184671139s" podCreationTimestamp="2026-04-28 19:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:50:53.183392275 +0000 UTC m=+2069.146292654" watchObservedRunningTime="2026-04-28 19:50:53.184671139 +0000 UTC m=+2069.147571502" Apr 28 19:50:54.164391 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:54.164363 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:50:54.165782 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:54.165753 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 28 19:50:54.552519 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:54.552470 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerName="sequence-graph-7809f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:50:55.167748 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.167707 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 28 19:50:55.336268 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.336229 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.41:8643/healthz\": dial tcp 10.134.0.41:8643: connect: connection refused" Apr 28 19:50:55.341902 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.341879 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 28 19:50:55.733551 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.733527 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:50:55.783744 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.783703 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/140418b4-b40e-44c3-8c66-f09c82585534-proxy-tls\") pod \"140418b4-b40e-44c3-8c66-f09c82585534\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " Apr 28 19:50:55.783915 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.783760 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-7809f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/140418b4-b40e-44c3-8c66-f09c82585534-success-200-isvc-7809f-kube-rbac-proxy-sar-config\") pod \"140418b4-b40e-44c3-8c66-f09c82585534\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " Apr 28 19:50:55.783915 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.783839 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75jrw\" (UniqueName: \"kubernetes.io/projected/140418b4-b40e-44c3-8c66-f09c82585534-kube-api-access-75jrw\") pod \"140418b4-b40e-44c3-8c66-f09c82585534\" (UID: \"140418b4-b40e-44c3-8c66-f09c82585534\") " Apr 28 19:50:55.784126 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.784102 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/140418b4-b40e-44c3-8c66-f09c82585534-success-200-isvc-7809f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-7809f-kube-rbac-proxy-sar-config") pod "140418b4-b40e-44c3-8c66-f09c82585534" (UID: "140418b4-b40e-44c3-8c66-f09c82585534"). InnerVolumeSpecName "success-200-isvc-7809f-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:50:55.786091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.786064 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140418b4-b40e-44c3-8c66-f09c82585534-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "140418b4-b40e-44c3-8c66-f09c82585534" (UID: "140418b4-b40e-44c3-8c66-f09c82585534"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:50:55.786091 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.786080 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140418b4-b40e-44c3-8c66-f09c82585534-kube-api-access-75jrw" (OuterVolumeSpecName: "kube-api-access-75jrw") pod "140418b4-b40e-44c3-8c66-f09c82585534" (UID: "140418b4-b40e-44c3-8c66-f09c82585534"). InnerVolumeSpecName "kube-api-access-75jrw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:50:55.884897 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.884795 2576 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-7809f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/140418b4-b40e-44c3-8c66-f09c82585534-success-200-isvc-7809f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:50:55.884897 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.884843 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-75jrw\" (UniqueName: \"kubernetes.io/projected/140418b4-b40e-44c3-8c66-f09c82585534-kube-api-access-75jrw\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:50:55.884897 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:55.884855 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/140418b4-b40e-44c3-8c66-f09c82585534-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 
19:50:56.171384 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.171296 2576 generic.go:358] "Generic (PLEG): container finished" podID="140418b4-b40e-44c3-8c66-f09c82585534" containerID="0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08" exitCode=0 Apr 28 19:50:56.171384 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.171376 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" Apr 28 19:50:56.171914 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.171378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" event={"ID":"140418b4-b40e-44c3-8c66-f09c82585534","Type":"ContainerDied","Data":"0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08"} Apr 28 19:50:56.171914 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.171416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7" event={"ID":"140418b4-b40e-44c3-8c66-f09c82585534","Type":"ContainerDied","Data":"324165cc17f7c17b667e97d14c5cdeade1a4583647e4424c37c9815988a41efe"} Apr 28 19:50:56.171914 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.171430 2576 scope.go:117] "RemoveContainer" containerID="a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b" Apr 28 19:50:56.180504 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.180488 2576 scope.go:117] "RemoveContainer" containerID="0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08" Apr 28 19:50:56.187320 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.187302 2576 scope.go:117] "RemoveContainer" containerID="a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b" Apr 28 19:50:56.187558 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:50:56.187520 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b\": container with ID starting with a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b not found: ID does not exist" containerID="a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b" Apr 28 19:50:56.187657 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.187567 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b"} err="failed to get container status \"a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b\": rpc error: code = NotFound desc = could not find container \"a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b\": container with ID starting with a83bfdd036d2a2a33b19c759d2c8da67711aca57cbebc0eb86b11ac0ce84127b not found: ID does not exist" Apr 28 19:50:56.187657 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.187584 2576 scope.go:117] "RemoveContainer" containerID="0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08" Apr 28 19:50:56.187877 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:50:56.187861 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08\": container with ID starting with 0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08 not found: ID does not exist" containerID="0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08" Apr 28 19:50:56.187923 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.187882 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08"} err="failed to get container status \"0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08\": rpc error: code = NotFound desc = could 
not find container \"0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08\": container with ID starting with 0a9105f93b179065027567828d7290d80a93449fec16e7845210e9d33134ed08 not found: ID does not exist" Apr 28 19:50:56.196215 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.196190 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7"] Apr 28 19:50:56.202151 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.202130 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-7809f-predictor-b4788b9f6-r48p7"] Apr 28 19:50:56.514554 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:56.514514 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="140418b4-b40e-44c3-8c66-f09c82585534" path="/var/lib/kubelet/pods/140418b4-b40e-44c3-8c66-f09c82585534/volumes" Apr 28 19:50:59.552472 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:50:59.552437 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerName="sequence-graph-7809f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:51:00.171883 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:00.171855 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" Apr 28 19:51:00.172405 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:00.172376 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 28 19:51:01.043338 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:01.043309 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" Apr 28 19:51:04.552059 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:04.552024 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerName="sequence-graph-7809f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:51:04.552418 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:04.552115 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:51:09.552461 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:09.552419 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerName="sequence-graph-7809f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:51:10.173224 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:10.173186 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 28 19:51:12.819213 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.819162 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv"] Apr 28 19:51:12.819735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.819491 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kube-rbac-proxy" Apr 28 19:51:12.819735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.819502 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kube-rbac-proxy" Apr 28 19:51:12.819735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.819522 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" Apr 28 19:51:12.819735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.819530 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" Apr 28 19:51:12.819735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.819587 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kserve-container" Apr 28 19:51:12.819735 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.819597 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="140418b4-b40e-44c3-8c66-f09c82585534" containerName="kube-rbac-proxy" Apr 28 19:51:12.824083 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.824065 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:12.826778 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.826756 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-6bad9-kube-rbac-proxy-sar-config\"" Apr 28 19:51:12.826870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.826760 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-6bad9-serving-cert\"" Apr 28 19:51:12.831660 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.831636 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv"] Apr 28 19:51:12.926363 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.926317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29006596-169d-4609-ad97-0a2610da37c1-proxy-tls\") pod \"splitter-graph-6bad9-5444bd87c8-6slkv\" (UID: \"29006596-169d-4609-ad97-0a2610da37c1\") " pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:12.926580 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:12.926431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29006596-169d-4609-ad97-0a2610da37c1-openshift-service-ca-bundle\") pod \"splitter-graph-6bad9-5444bd87c8-6slkv\" (UID: \"29006596-169d-4609-ad97-0a2610da37c1\") " pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:13.027363 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:13.027330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29006596-169d-4609-ad97-0a2610da37c1-proxy-tls\") pod \"splitter-graph-6bad9-5444bd87c8-6slkv\" (UID: 
\"29006596-169d-4609-ad97-0a2610da37c1\") " pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:13.027528 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:13.027419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29006596-169d-4609-ad97-0a2610da37c1-openshift-service-ca-bundle\") pod \"splitter-graph-6bad9-5444bd87c8-6slkv\" (UID: \"29006596-169d-4609-ad97-0a2610da37c1\") " pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:13.028148 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:13.028102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29006596-169d-4609-ad97-0a2610da37c1-openshift-service-ca-bundle\") pod \"splitter-graph-6bad9-5444bd87c8-6slkv\" (UID: \"29006596-169d-4609-ad97-0a2610da37c1\") " pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:13.029946 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:13.029922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29006596-169d-4609-ad97-0a2610da37c1-proxy-tls\") pod \"splitter-graph-6bad9-5444bd87c8-6slkv\" (UID: \"29006596-169d-4609-ad97-0a2610da37c1\") " pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:13.134821 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:13.134727 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:13.256440 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:13.256413 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv"] Apr 28 19:51:13.258803 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:51:13.258774 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29006596_169d_4609_ad97_0a2610da37c1.slice/crio-747756d3a74da5e7c449cb20b70a24c55c023b1b7db441a34a0ab086576c0aa0 WatchSource:0}: Error finding container 747756d3a74da5e7c449cb20b70a24c55c023b1b7db441a34a0ab086576c0aa0: Status 404 returned error can't find the container with id 747756d3a74da5e7c449cb20b70a24c55c023b1b7db441a34a0ab086576c0aa0 Apr 28 19:51:14.231805 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:14.231762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" event={"ID":"29006596-169d-4609-ad97-0a2610da37c1","Type":"ContainerStarted","Data":"19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d"} Apr 28 19:51:14.231805 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:14.231800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" event={"ID":"29006596-169d-4609-ad97-0a2610da37c1","Type":"ContainerStarted","Data":"747756d3a74da5e7c449cb20b70a24c55c023b1b7db441a34a0ab086576c0aa0"} Apr 28 19:51:14.232415 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:14.231849 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:14.249921 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:14.249865 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" 
podStartSLOduration=2.249850548 podStartE2EDuration="2.249850548s" podCreationTimestamp="2026-04-28 19:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:51:14.248587274 +0000 UTC m=+2090.211487663" watchObservedRunningTime="2026-04-28 19:51:14.249850548 +0000 UTC m=+2090.212750913" Apr 28 19:51:14.552519 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:14.552482 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerName="sequence-graph-7809f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:51:19.552878 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:19.552839 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerName="sequence-graph-7809f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:51:20.172591 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:20.172550 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 28 19:51:20.241867 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:20.241835 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:22.259827 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.259796 2576 generic.go:358] "Generic (PLEG): container finished" podID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerID="442ff091d44cc276df277ae6d47ecdde7953d91486fc72110f34c40a7e492b8b" exitCode=0 Apr 28 19:51:22.260236 
ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.259844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" event={"ID":"8a0de09b-f33b-4bc3-a0d1-00b83d801e34","Type":"ContainerDied","Data":"442ff091d44cc276df277ae6d47ecdde7953d91486fc72110f34c40a7e492b8b"} Apr 28 19:51:22.374247 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.374223 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" Apr 28 19:51:22.510210 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.510143 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-openshift-service-ca-bundle\") pod \"8a0de09b-f33b-4bc3-a0d1-00b83d801e34\" (UID: \"8a0de09b-f33b-4bc3-a0d1-00b83d801e34\") " Apr 28 19:51:22.510210 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.510190 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-proxy-tls\") pod \"8a0de09b-f33b-4bc3-a0d1-00b83d801e34\" (UID: \"8a0de09b-f33b-4bc3-a0d1-00b83d801e34\") " Apr 28 19:51:22.510565 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.510538 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8a0de09b-f33b-4bc3-a0d1-00b83d801e34" (UID: "8a0de09b-f33b-4bc3-a0d1-00b83d801e34"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:51:22.512431 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.512409 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8a0de09b-f33b-4bc3-a0d1-00b83d801e34" (UID: "8a0de09b-f33b-4bc3-a0d1-00b83d801e34"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:51:22.611107 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.611079 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:51:22.611107 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.611103 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a0de09b-f33b-4bc3-a0d1-00b83d801e34-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:51:22.824462 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.824384 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv"]
Apr 28 19:51:22.824665 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.824642 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" podUID="29006596-169d-4609-ad97-0a2610da37c1" containerName="splitter-graph-6bad9" containerID="cri-o://19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d" gracePeriod=30
Apr 28 19:51:22.925383 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.925344 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr"]
Apr 28 19:51:22.925677 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.925645 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kserve-container" containerID="cri-o://07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96" gracePeriod=30
Apr 28 19:51:22.925823 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.925718 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kube-rbac-proxy" containerID="cri-o://4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f" gracePeriod=30
Apr 28 19:51:22.971580 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.971556 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"]
Apr 28 19:51:22.971902 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.971890 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerName="sequence-graph-7809f"
Apr 28 19:51:22.971944 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.971903 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerName="sequence-graph-7809f"
Apr 28 19:51:22.971977 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.971954 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" containerName="sequence-graph-7809f"
Apr 28 19:51:22.975163 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.975147 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:22.977695 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.977673 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-93cee-predictor-serving-cert\""
Apr 28 19:51:22.977784 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.977725 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-93cee-kube-rbac-proxy-sar-config\""
Apr 28 19:51:22.990076 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:22.990055 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"]
Apr 28 19:51:23.114760 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.114678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3c065ac-d90a-42d8-86fd-49d138e4cde4-proxy-tls\") pod \"success-200-isvc-93cee-predictor-7d9b86d498-48dr9\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:23.114894 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.114799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-297vc\" (UniqueName: \"kubernetes.io/projected/a3c065ac-d90a-42d8-86fd-49d138e4cde4-kube-api-access-297vc\") pod \"success-200-isvc-93cee-predictor-7d9b86d498-48dr9\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:23.114894 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.114873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-93cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3c065ac-d90a-42d8-86fd-49d138e4cde4-success-200-isvc-93cee-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-93cee-predictor-7d9b86d498-48dr9\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:23.216210 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.216182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-93cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3c065ac-d90a-42d8-86fd-49d138e4cde4-success-200-isvc-93cee-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-93cee-predictor-7d9b86d498-48dr9\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:23.216370 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.216229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3c065ac-d90a-42d8-86fd-49d138e4cde4-proxy-tls\") pod \"success-200-isvc-93cee-predictor-7d9b86d498-48dr9\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:23.216370 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.216265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-297vc\" (UniqueName: \"kubernetes.io/projected/a3c065ac-d90a-42d8-86fd-49d138e4cde4-kube-api-access-297vc\") pod \"success-200-isvc-93cee-predictor-7d9b86d498-48dr9\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:23.216880 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.216853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-93cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3c065ac-d90a-42d8-86fd-49d138e4cde4-success-200-isvc-93cee-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-93cee-predictor-7d9b86d498-48dr9\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:23.218730 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.218713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3c065ac-d90a-42d8-86fd-49d138e4cde4-proxy-tls\") pod \"success-200-isvc-93cee-predictor-7d9b86d498-48dr9\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:23.224055 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.224037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-297vc\" (UniqueName: \"kubernetes.io/projected/a3c065ac-d90a-42d8-86fd-49d138e4cde4-kube-api-access-297vc\") pod \"success-200-isvc-93cee-predictor-7d9b86d498-48dr9\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:23.265961 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.265930 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl" event={"ID":"8a0de09b-f33b-4bc3-a0d1-00b83d801e34","Type":"ContainerDied","Data":"b65e3c633cf46c5957c948c721aaf2d70b34be755f0ef8f7125cd148ff01357d"}
Apr 28 19:51:23.266290 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.265967 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl"
Apr 28 19:51:23.266290 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.265975 2576 scope.go:117] "RemoveContainer" containerID="442ff091d44cc276df277ae6d47ecdde7953d91486fc72110f34c40a7e492b8b"
Apr 28 19:51:23.268050 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.268019 2576 generic.go:358] "Generic (PLEG): container finished" podID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerID="4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f" exitCode=2
Apr 28 19:51:23.268187 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.268102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" event={"ID":"e692a5c3-0746-4dbb-9f52-0baa64cb948f","Type":"ContainerDied","Data":"4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f"}
Apr 28 19:51:23.289007 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.288984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:23.291464 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.290172 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl"]
Apr 28 19:51:23.291464 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.290206 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7809f-56768674bd-l46kl"]
Apr 28 19:51:23.413275 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:23.413254 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"]
Apr 28 19:51:23.415331 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:51:23.415299 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c065ac_d90a_42d8_86fd_49d138e4cde4.slice/crio-8c35a4a839d320f1bb876a18eb0b594814b274af61e7bf9b98ebea27d97d29af WatchSource:0}: Error finding container 8c35a4a839d320f1bb876a18eb0b594814b274af61e7bf9b98ebea27d97d29af: Status 404 returned error can't find the container with id 8c35a4a839d320f1bb876a18eb0b594814b274af61e7bf9b98ebea27d97d29af
Apr 28 19:51:24.273952 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:24.273917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" event={"ID":"a3c065ac-d90a-42d8-86fd-49d138e4cde4","Type":"ContainerStarted","Data":"bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55"}
Apr 28 19:51:24.273952 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:24.273954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" event={"ID":"a3c065ac-d90a-42d8-86fd-49d138e4cde4","Type":"ContainerStarted","Data":"70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6"}
Apr 28 19:51:24.274416 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:24.273965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" event={"ID":"a3c065ac-d90a-42d8-86fd-49d138e4cde4","Type":"ContainerStarted","Data":"8c35a4a839d320f1bb876a18eb0b594814b274af61e7bf9b98ebea27d97d29af"}
Apr 28 19:51:24.274416 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:24.274200 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:24.295328 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:24.295266 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" podStartSLOduration=2.2952503650000002 podStartE2EDuration="2.295250365s" podCreationTimestamp="2026-04-28 19:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:51:24.293108797 +0000 UTC m=+2100.256009161" watchObservedRunningTime="2026-04-28 19:51:24.295250365 +0000 UTC m=+2100.258150791"
Apr 28 19:51:24.514195 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:24.514151 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0de09b-f33b-4bc3-a0d1-00b83d801e34" path="/var/lib/kubelet/pods/8a0de09b-f33b-4bc3-a0d1-00b83d801e34/volumes"
Apr 28 19:51:24.566076 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:24.565994 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log"
Apr 28 19:51:24.571942 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:24.571914 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log"
Apr 28 19:51:24.576839 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:24.576816 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log"
Apr 28 19:51:24.582391 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:24.582369 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log"
Apr 28 19:51:25.240261 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:25.240225 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" podUID="29006596-169d-4609-ad97-0a2610da37c1" containerName="splitter-graph-6bad9" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:51:25.277659 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:25.277626 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:25.279015 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:25.278986 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 28 19:51:26.037488 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.037438 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.44:8643/healthz\": dial tcp 10.134.0.44:8643: connect: connection refused"
Apr 28 19:51:26.266190 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.266170 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr"
Apr 28 19:51:26.281166 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.281139 2576 generic.go:358] "Generic (PLEG): container finished" podID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerID="07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96" exitCode=0
Apr 28 19:51:26.281543 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.281202 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr"
Apr 28 19:51:26.281543 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.281205 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" event={"ID":"e692a5c3-0746-4dbb-9f52-0baa64cb948f","Type":"ContainerDied","Data":"07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96"}
Apr 28 19:51:26.281543 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.281247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr" event={"ID":"e692a5c3-0746-4dbb-9f52-0baa64cb948f","Type":"ContainerDied","Data":"76cf8509c575fd476b7f9ee9cf694e10f63d7e223c99431cbfbdaef7f12ff4ee"}
Apr 28 19:51:26.281543 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.281263 2576 scope.go:117] "RemoveContainer" containerID="4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f"
Apr 28 19:51:26.281813 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.281778 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 28 19:51:26.291386 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.291317 2576 scope.go:117] "RemoveContainer" containerID="07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96"
Apr 28 19:51:26.302798 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.302776 2576 scope.go:117] "RemoveContainer" containerID="4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f"
Apr 28 19:51:26.303074 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:51:26.303041 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f\": container with ID starting with 4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f not found: ID does not exist" containerID="4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f"
Apr 28 19:51:26.303128 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.303085 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f"} err="failed to get container status \"4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f\": rpc error: code = NotFound desc = could not find container \"4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f\": container with ID starting with 4e478a80ea77c03de82c368e16c9a60ace4bea456918ab60628bff052ce40b7f not found: ID does not exist"
Apr 28 19:51:26.303128 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.303109 2576 scope.go:117] "RemoveContainer" containerID="07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96"
Apr 28 19:51:26.303299 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:51:26.303277 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96\": container with ID starting with 07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96 not found: ID does not exist" containerID="07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96"
Apr 28 19:51:26.303338 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.303306 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96"} err="failed to get container status \"07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96\": rpc error: code = NotFound desc = could not find container \"07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96\": container with ID starting with 07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96 not found: ID does not exist"
Apr 28 19:51:26.442176 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.442143 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn64x\" (UniqueName: \"kubernetes.io/projected/e692a5c3-0746-4dbb-9f52-0baa64cb948f-kube-api-access-wn64x\") pod \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") "
Apr 28 19:51:26.442343 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.442228 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e692a5c3-0746-4dbb-9f52-0baa64cb948f-proxy-tls\") pod \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") "
Apr 28 19:51:26.442343 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.442318 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-6bad9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e692a5c3-0746-4dbb-9f52-0baa64cb948f-success-200-isvc-6bad9-kube-rbac-proxy-sar-config\") pod \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\" (UID: \"e692a5c3-0746-4dbb-9f52-0baa64cb948f\") "
Apr 28 19:51:26.442774 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.442740 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e692a5c3-0746-4dbb-9f52-0baa64cb948f-success-200-isvc-6bad9-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-6bad9-kube-rbac-proxy-sar-config") pod "e692a5c3-0746-4dbb-9f52-0baa64cb948f" (UID: "e692a5c3-0746-4dbb-9f52-0baa64cb948f"). InnerVolumeSpecName "success-200-isvc-6bad9-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:51:26.444764 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.444723 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e692a5c3-0746-4dbb-9f52-0baa64cb948f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e692a5c3-0746-4dbb-9f52-0baa64cb948f" (UID: "e692a5c3-0746-4dbb-9f52-0baa64cb948f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:51:26.444867 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.444789 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e692a5c3-0746-4dbb-9f52-0baa64cb948f-kube-api-access-wn64x" (OuterVolumeSpecName: "kube-api-access-wn64x") pod "e692a5c3-0746-4dbb-9f52-0baa64cb948f" (UID: "e692a5c3-0746-4dbb-9f52-0baa64cb948f"). InnerVolumeSpecName "kube-api-access-wn64x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:51:26.544113 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.544046 2576 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-6bad9-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e692a5c3-0746-4dbb-9f52-0baa64cb948f-success-200-isvc-6bad9-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:51:26.544113 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.544082 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wn64x\" (UniqueName: \"kubernetes.io/projected/e692a5c3-0746-4dbb-9f52-0baa64cb948f-kube-api-access-wn64x\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:51:26.544113 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.544097 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e692a5c3-0746-4dbb-9f52-0baa64cb948f-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 19:51:26.606870 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.606835 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr"]
Apr 28 19:51:26.622136 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:26.622105 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6bad9-predictor-5d7df76d56-h7pdr"]
Apr 28 19:51:28.513719 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:28.513684 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" path="/var/lib/kubelet/pods/e692a5c3-0746-4dbb-9f52-0baa64cb948f/volumes"
Apr 28 19:51:30.172418 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:30.172381 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Apr 28 19:51:30.239570 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:30.239524 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" podUID="29006596-169d-4609-ad97-0a2610da37c1" containerName="splitter-graph-6bad9" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:51:31.287570 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:31.287540 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"
Apr 28 19:51:31.288125 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:31.288101 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 28 19:51:35.240327 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:35.240282 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" podUID="29006596-169d-4609-ad97-0a2610da37c1" containerName="splitter-graph-6bad9" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:51:35.240800 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:35.240421 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv"
Apr 28 19:51:40.173684 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:40.173654 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw"
Apr 28 19:51:40.239788 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:40.239754 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" podUID="29006596-169d-4609-ad97-0a2610da37c1" containerName="splitter-graph-6bad9" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:51:41.288666 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:41.288628 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 28 19:51:45.239736 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:45.239695 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" podUID="29006596-169d-4609-ad97-0a2610da37c1" containerName="splitter-graph-6bad9" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:51:50.239946 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:50.239907 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" podUID="29006596-169d-4609-ad97-0a2610da37c1" containerName="splitter-graph-6bad9" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:51:51.288222 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:51.288184 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused"
Apr 28 19:51:52.489614 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.489565 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"]
Apr 28 19:51:52.489957 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.489893 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kube-rbac-proxy"
Apr 28 19:51:52.489957 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.489906 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kube-rbac-proxy"
Apr 28 19:51:52.489957 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.489917 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kserve-container"
Apr 28 19:51:52.489957 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.489923 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kserve-container"
Apr 28 19:51:52.490081 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.489976 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kube-rbac-proxy"
Apr 28 19:51:52.490081 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.489985 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e692a5c3-0746-4dbb-9f52-0baa64cb948f" containerName="kserve-container"
Apr 28 19:51:52.494172 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.494155 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 19:51:52.496950 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.496926 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-e5e5a-serving-cert\""
Apr 28 19:51:52.497155 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.497139 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-e5e5a-kube-rbac-proxy-sar-config\""
Apr 28 19:51:52.501475 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.501452 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"]
Apr 28 19:51:52.551627 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.551571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-proxy-tls\") pod \"switch-graph-e5e5a-65c9566545-brp96\" (UID: \"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60\") " pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 19:51:52.551808 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.551736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-openshift-service-ca-bundle\") pod \"switch-graph-e5e5a-65c9566545-brp96\" (UID: \"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60\") " pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 19:51:52.652586 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.652544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-proxy-tls\") pod \"switch-graph-e5e5a-65c9566545-brp96\" (UID: \"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60\") " pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 19:51:52.652586 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.652589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-openshift-service-ca-bundle\") pod \"switch-graph-e5e5a-65c9566545-brp96\" (UID: \"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60\") " pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 19:51:52.653400 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.653370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-openshift-service-ca-bundle\") pod \"switch-graph-e5e5a-65c9566545-brp96\" (UID: \"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60\") " pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 19:51:52.655301 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.655281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-proxy-tls\") pod \"switch-graph-e5e5a-65c9566545-brp96\" (UID: \"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60\") " pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 19:51:52.804784 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.804694 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 19:51:52.847955 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:51:52.847908 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29006596_169d_4609_ad97_0a2610da37c1.slice/crio-747756d3a74da5e7c449cb20b70a24c55c023b1b7db441a34a0ab086576c0aa0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice/crio-07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice/crio-76cf8509c575fd476b7f9ee9cf694e10f63d7e223c99431cbfbdaef7f12ff4ee\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice/crio-conmon-07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29006596_169d_4609_ad97_0a2610da37c1.slice/crio-conmon-19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice\": RecentStats: unable to find data in memory cache]"
Apr 28 19:51:52.848193 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:51:52.847907 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29006596_169d_4609_ad97_0a2610da37c1.slice/crio-conmon-19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29006596_169d_4609_ad97_0a2610da37c1.slice/crio-19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice/crio-conmon-07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice/crio-07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice/crio-76cf8509c575fd476b7f9ee9cf694e10f63d7e223c99431cbfbdaef7f12ff4ee\": RecentStats: unable to find data in memory cache]"
Apr 28 19:51:52.848536 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:51:52.848496 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice/crio-76cf8509c575fd476b7f9ee9cf694e10f63d7e223c99431cbfbdaef7f12ff4ee\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice/crio-conmon-07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96.scope\": RecentStats:
unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29006596_169d_4609_ad97_0a2610da37c1.slice/crio-conmon-19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode692a5c3_0746_4dbb_9f52_0baa64cb948f.slice/crio-07e5c37be67cc104451b9438d0d6b1f43c61495683ba7d06ed365ff03f9b7b96.scope\": RecentStats: unable to find data in memory cache]" Apr 28 19:51:52.849230 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:51:52.848005 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29006596_169d_4609_ad97_0a2610da37c1.slice/crio-conmon-19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d.scope\": RecentStats: unable to find data in memory cache]" Apr 28 19:51:52.990635 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:52.990586 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:53.056421 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.056337 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29006596-169d-4609-ad97-0a2610da37c1-proxy-tls\") pod \"29006596-169d-4609-ad97-0a2610da37c1\" (UID: \"29006596-169d-4609-ad97-0a2610da37c1\") " Apr 28 19:51:53.056421 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.056401 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29006596-169d-4609-ad97-0a2610da37c1-openshift-service-ca-bundle\") pod \"29006596-169d-4609-ad97-0a2610da37c1\" (UID: \"29006596-169d-4609-ad97-0a2610da37c1\") " Apr 28 19:51:53.056771 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.056748 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29006596-169d-4609-ad97-0a2610da37c1-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "29006596-169d-4609-ad97-0a2610da37c1" (UID: "29006596-169d-4609-ad97-0a2610da37c1"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:51:53.058494 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.058469 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29006596-169d-4609-ad97-0a2610da37c1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "29006596-169d-4609-ad97-0a2610da37c1" (UID: "29006596-169d-4609-ad97-0a2610da37c1"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:51:53.145018 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.144989 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"] Apr 28 19:51:53.147169 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:51:53.147134 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f4ad2a_fae1_4e8e_932b_b7d20fb0eb60.slice/crio-7a70f1624068a7ddc9eb2150b6ba1190d668b2b0265e55b2ae06b05573cf8138 WatchSource:0}: Error finding container 7a70f1624068a7ddc9eb2150b6ba1190d668b2b0265e55b2ae06b05573cf8138: Status 404 returned error can't find the container with id 7a70f1624068a7ddc9eb2150b6ba1190d668b2b0265e55b2ae06b05573cf8138 Apr 28 19:51:53.157624 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.157583 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29006596-169d-4609-ad97-0a2610da37c1-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:51:53.157624 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.157622 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29006596-169d-4609-ad97-0a2610da37c1-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\"" Apr 28 19:51:53.375819 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.375723 2576 generic.go:358] "Generic (PLEG): container finished" podID="29006596-169d-4609-ad97-0a2610da37c1" containerID="19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d" exitCode=0 Apr 28 19:51:53.375819 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.375794 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" Apr 28 19:51:53.376034 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.375814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" event={"ID":"29006596-169d-4609-ad97-0a2610da37c1","Type":"ContainerDied","Data":"19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d"} Apr 28 19:51:53.376034 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.375850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv" event={"ID":"29006596-169d-4609-ad97-0a2610da37c1","Type":"ContainerDied","Data":"747756d3a74da5e7c449cb20b70a24c55c023b1b7db441a34a0ab086576c0aa0"} Apr 28 19:51:53.376034 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.375874 2576 scope.go:117] "RemoveContainer" containerID="19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d" Apr 28 19:51:53.377287 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.377259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" event={"ID":"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60","Type":"ContainerStarted","Data":"a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2"} Apr 28 19:51:53.377398 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.377288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" event={"ID":"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60","Type":"ContainerStarted","Data":"7a70f1624068a7ddc9eb2150b6ba1190d668b2b0265e55b2ae06b05573cf8138"} Apr 28 19:51:53.377572 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.377561 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" Apr 28 19:51:53.386754 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.386732 2576 
scope.go:117] "RemoveContainer" containerID="19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d" Apr 28 19:51:53.387086 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:51:53.387065 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d\": container with ID starting with 19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d not found: ID does not exist" containerID="19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d" Apr 28 19:51:53.387160 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.387101 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d"} err="failed to get container status \"19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d\": rpc error: code = NotFound desc = could not find container \"19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d\": container with ID starting with 19338b0a8391d0410e414999c667a2b01aba60e64f59ee8277e0ede6b64f422d not found: ID does not exist" Apr 28 19:51:53.396946 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.396895 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" podStartSLOduration=1.39688092 podStartE2EDuration="1.39688092s" podCreationTimestamp="2026-04-28 19:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:51:53.395533712 +0000 UTC m=+2129.358434074" watchObservedRunningTime="2026-04-28 19:51:53.39688092 +0000 UTC m=+2129.359781284" Apr 28 19:51:53.407865 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.407824 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv"] Apr 28 19:51:53.410888 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:53.410856 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6bad9-5444bd87c8-6slkv"] Apr 28 19:51:54.513958 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:54.513923 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29006596-169d-4609-ad97-0a2610da37c1" path="/var/lib/kubelet/pods/29006596-169d-4609-ad97-0a2610da37c1/volumes" Apr 28 19:51:59.387577 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:51:59.387545 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" Apr 28 19:52:01.288128 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:01.288081 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 28 19:52:11.289712 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:11.289675 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" Apr 28 19:52:23.038793 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.038758 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j"] Apr 28 19:52:23.039189 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.039094 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29006596-169d-4609-ad97-0a2610da37c1" containerName="splitter-graph-6bad9" Apr 28 19:52:23.039189 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.039105 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="29006596-169d-4609-ad97-0a2610da37c1" 
containerName="splitter-graph-6bad9" Apr 28 19:52:23.039189 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.039155 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="29006596-169d-4609-ad97-0a2610da37c1" containerName="splitter-graph-6bad9" Apr 28 19:52:23.042232 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.042212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:52:23.044528 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.044504 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-93cee-serving-cert\"" Apr 28 19:52:23.044678 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.044656 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-93cee-kube-rbac-proxy-sar-config\"" Apr 28 19:52:23.050693 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.050674 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j"] Apr 28 19:52:23.105335 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.105298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-proxy-tls\") pod \"splitter-graph-93cee-7d756c9f5c-6gn5j\" (UID: \"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\") " pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:52:23.105536 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.105355 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-openshift-service-ca-bundle\") pod \"splitter-graph-93cee-7d756c9f5c-6gn5j\" (UID: 
\"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\") " pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:52:23.205835 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.205800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-proxy-tls\") pod \"splitter-graph-93cee-7d756c9f5c-6gn5j\" (UID: \"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\") " pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:52:23.206023 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.205854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-openshift-service-ca-bundle\") pod \"splitter-graph-93cee-7d756c9f5c-6gn5j\" (UID: \"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\") " pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:52:23.206023 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:52:23.205973 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-93cee-serving-cert: secret "splitter-graph-93cee-serving-cert" not found Apr 28 19:52:23.206139 ip-10-0-139-184 kubenswrapper[2576]: E0428 19:52:23.206050 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-proxy-tls podName:8ea68c33-7034-4b7d-a6b6-4642a4b344f0 nodeName:}" failed. No retries permitted until 2026-04-28 19:52:23.70603015 +0000 UTC m=+2159.668930496 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-proxy-tls") pod "splitter-graph-93cee-7d756c9f5c-6gn5j" (UID: "8ea68c33-7034-4b7d-a6b6-4642a4b344f0") : secret "splitter-graph-93cee-serving-cert" not found Apr 28 19:52:23.206448 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.206429 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-openshift-service-ca-bundle\") pod \"splitter-graph-93cee-7d756c9f5c-6gn5j\" (UID: \"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\") " pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:52:23.709783 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.709750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-proxy-tls\") pod \"splitter-graph-93cee-7d756c9f5c-6gn5j\" (UID: \"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\") " pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:52:23.712445 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.712411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-proxy-tls\") pod \"splitter-graph-93cee-7d756c9f5c-6gn5j\" (UID: \"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\") " pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:52:23.952816 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:23.952777 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:52:24.080195 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:24.080164 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j"] Apr 28 19:52:24.082403 ip-10-0-139-184 kubenswrapper[2576]: W0428 19:52:24.082368 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea68c33_7034_4b7d_a6b6_4642a4b344f0.slice/crio-f9eddc1958fa57a8659ee6f7d3819f6cf8a4b82f87645963fdf035029630c6fe WatchSource:0}: Error finding container f9eddc1958fa57a8659ee6f7d3819f6cf8a4b82f87645963fdf035029630c6fe: Status 404 returned error can't find the container with id f9eddc1958fa57a8659ee6f7d3819f6cf8a4b82f87645963fdf035029630c6fe Apr 28 19:52:24.486055 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:24.486019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" event={"ID":"8ea68c33-7034-4b7d-a6b6-4642a4b344f0","Type":"ContainerStarted","Data":"783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df"} Apr 28 19:52:24.486055 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:24.486058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" event={"ID":"8ea68c33-7034-4b7d-a6b6-4642a4b344f0","Type":"ContainerStarted","Data":"f9eddc1958fa57a8659ee6f7d3819f6cf8a4b82f87645963fdf035029630c6fe"} Apr 28 19:52:24.486288 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:24.486175 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:52:24.513059 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:24.512993 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" 
podStartSLOduration=1.5129748219999999 podStartE2EDuration="1.512974822s" podCreationTimestamp="2026-04-28 19:52:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:52:24.511097522 +0000 UTC m=+2160.473997888" watchObservedRunningTime="2026-04-28 19:52:24.512974822 +0000 UTC m=+2160.475875188" Apr 28 19:52:30.494910 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:52:30.494880 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" Apr 28 19:56:24.587264 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:56:24.587236 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:56:24.593026 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:56:24.593003 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 19:56:24.598592 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:56:24.598575 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log" Apr 28 19:56:24.603586 ip-10-0-139-184 kubenswrapper[2576]: I0428 19:56:24.603570 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 20:00:37.817434 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:37.817399 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j"] Apr 28 20:00:37.818033 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:37.817674 2576 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerName="splitter-graph-93cee" containerID="cri-o://783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df" gracePeriod=30 Apr 28 20:00:37.925836 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:37.925803 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"] Apr 28 20:00:37.926137 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:37.926108 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kserve-container" containerID="cri-o://70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6" gracePeriod=30 Apr 28 20:00:37.926256 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:37.926140 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kube-rbac-proxy" containerID="cri-o://bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55" gracePeriod=30 Apr 28 20:00:38.098547 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:38.098451 2576 generic.go:358] "Generic (PLEG): container finished" podID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerID="bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55" exitCode=2 Apr 28 20:00:38.098547 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:38.098506 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" event={"ID":"a3c065ac-d90a-42d8-86fd-49d138e4cde4","Type":"ContainerDied","Data":"bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55"} Apr 28 20:00:40.493215 ip-10-0-139-184 kubenswrapper[2576]: I0428 
20:00:40.493177 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerName="splitter-graph-93cee" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 20:00:41.075795 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.075770 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" Apr 28 20:00:41.110220 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.110125 2576 generic.go:358] "Generic (PLEG): container finished" podID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerID="70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6" exitCode=0 Apr 28 20:00:41.110220 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.110186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" event={"ID":"a3c065ac-d90a-42d8-86fd-49d138e4cde4","Type":"ContainerDied","Data":"70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6"} Apr 28 20:00:41.110220 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.110220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" event={"ID":"a3c065ac-d90a-42d8-86fd-49d138e4cde4","Type":"ContainerDied","Data":"8c35a4a839d320f1bb876a18eb0b594814b274af61e7bf9b98ebea27d97d29af"} Apr 28 20:00:41.110477 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.110241 2576 scope.go:117] "RemoveContainer" containerID="bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55" Apr 28 20:00:41.110477 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.110275 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9" Apr 28 20:00:41.120108 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.120085 2576 scope.go:117] "RemoveContainer" containerID="70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6" Apr 28 20:00:41.128801 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.128771 2576 scope.go:117] "RemoveContainer" containerID="bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55" Apr 28 20:00:41.129075 ip-10-0-139-184 kubenswrapper[2576]: E0428 20:00:41.129054 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55\": container with ID starting with bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55 not found: ID does not exist" containerID="bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55" Apr 28 20:00:41.129118 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.129086 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55"} err="failed to get container status \"bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55\": rpc error: code = NotFound desc = could not find container \"bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55\": container with ID starting with bce99bbd504e016d7f3f18dbae7d94fb58d64e942e5cf85781bf0a101efd2d55 not found: ID does not exist" Apr 28 20:00:41.129118 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.129104 2576 scope.go:117] "RemoveContainer" containerID="70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6" Apr 28 20:00:41.129326 ip-10-0-139-184 kubenswrapper[2576]: E0428 20:00:41.129310 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6\": container with ID starting with 70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6 not found: ID does not exist" containerID="70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6" Apr 28 20:00:41.129376 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.129334 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6"} err="failed to get container status \"70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6\": rpc error: code = NotFound desc = could not find container \"70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6\": container with ID starting with 70142a6b903ff37405f244cb1c878343b18b3bc40d1a1f0af80703fb1177f9a6 not found: ID does not exist" Apr 28 20:00:41.237518 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.237484 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3c065ac-d90a-42d8-86fd-49d138e4cde4-proxy-tls\") pod \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " Apr 28 20:00:41.237737 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.237546 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-297vc\" (UniqueName: \"kubernetes.io/projected/a3c065ac-d90a-42d8-86fd-49d138e4cde4-kube-api-access-297vc\") pod \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") " Apr 28 20:00:41.237737 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.237582 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-93cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3c065ac-d90a-42d8-86fd-49d138e4cde4-success-200-isvc-93cee-kube-rbac-proxy-sar-config\") pod 
\"a3c065ac-d90a-42d8-86fd-49d138e4cde4\" (UID: \"a3c065ac-d90a-42d8-86fd-49d138e4cde4\") "
Apr 28 20:00:41.237992 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.237967 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c065ac-d90a-42d8-86fd-49d138e4cde4-success-200-isvc-93cee-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-93cee-kube-rbac-proxy-sar-config") pod "a3c065ac-d90a-42d8-86fd-49d138e4cde4" (UID: "a3c065ac-d90a-42d8-86fd-49d138e4cde4"). InnerVolumeSpecName "success-200-isvc-93cee-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 20:00:41.239825 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.239797 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c065ac-d90a-42d8-86fd-49d138e4cde4-kube-api-access-297vc" (OuterVolumeSpecName: "kube-api-access-297vc") pod "a3c065ac-d90a-42d8-86fd-49d138e4cde4" (UID: "a3c065ac-d90a-42d8-86fd-49d138e4cde4"). InnerVolumeSpecName "kube-api-access-297vc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 20:00:41.239921 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.239872 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c065ac-d90a-42d8-86fd-49d138e4cde4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a3c065ac-d90a-42d8-86fd-49d138e4cde4" (UID: "a3c065ac-d90a-42d8-86fd-49d138e4cde4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:00:41.338762 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.338725 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3c065ac-d90a-42d8-86fd-49d138e4cde4-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 20:00:41.338762 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.338756 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-297vc\" (UniqueName: \"kubernetes.io/projected/a3c065ac-d90a-42d8-86fd-49d138e4cde4-kube-api-access-297vc\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 20:00:41.338762 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.338767 2576 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-93cee-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3c065ac-d90a-42d8-86fd-49d138e4cde4-success-200-isvc-93cee-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 20:00:41.438464 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.438425 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"]
Apr 28 20:00:41.445293 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:41.445252 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-93cee-predictor-7d9b86d498-48dr9"]
Apr 28 20:00:42.513959 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:42.513922 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" path="/var/lib/kubelet/pods/a3c065ac-d90a-42d8-86fd-49d138e4cde4/volumes"
Apr 28 20:00:45.494151 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:45.494111 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerName="splitter-graph-93cee" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:00:50.493384 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:50.493341 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerName="splitter-graph-93cee" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:00:50.493878 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:50.493448 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j"
Apr 28 20:00:55.493558 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:00:55.493508 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerName="splitter-graph-93cee" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:01:00.493724 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:00.493672 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerName="splitter-graph-93cee" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:01:05.493283 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:05.493238 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerName="splitter-graph-93cee" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:01:07.962166 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:07.962142 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j"
Apr 28 20:01:08.051412 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.051367 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-proxy-tls\") pod \"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\" (UID: \"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\") "
Apr 28 20:01:08.051412 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.051423 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-openshift-service-ca-bundle\") pod \"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\" (UID: \"8ea68c33-7034-4b7d-a6b6-4642a4b344f0\") "
Apr 28 20:01:08.051875 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.051846 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8ea68c33-7034-4b7d-a6b6-4642a4b344f0" (UID: "8ea68c33-7034-4b7d-a6b6-4642a4b344f0"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 20:01:08.053841 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.053816 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8ea68c33-7034-4b7d-a6b6-4642a4b344f0" (UID: "8ea68c33-7034-4b7d-a6b6-4642a4b344f0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:01:08.152516 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.152435 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 20:01:08.152516 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.152469 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ea68c33-7034-4b7d-a6b6-4642a4b344f0-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 20:01:08.198204 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.198175 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerID="783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df" exitCode=0
Apr 28 20:01:08.198327 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.198236 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j"
Apr 28 20:01:08.198327 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.198251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" event={"ID":"8ea68c33-7034-4b7d-a6b6-4642a4b344f0","Type":"ContainerDied","Data":"783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df"}
Apr 28 20:01:08.198327 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.198284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j" event={"ID":"8ea68c33-7034-4b7d-a6b6-4642a4b344f0","Type":"ContainerDied","Data":"f9eddc1958fa57a8659ee6f7d3819f6cf8a4b82f87645963fdf035029630c6fe"}
Apr 28 20:01:08.198327 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.198298 2576 scope.go:117] "RemoveContainer" containerID="783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df"
Apr 28 20:01:08.206799 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.206773 2576 scope.go:117] "RemoveContainer" containerID="783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df"
Apr 28 20:01:08.207060 ip-10-0-139-184 kubenswrapper[2576]: E0428 20:01:08.207042 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df\": container with ID starting with 783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df not found: ID does not exist" containerID="783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df"
Apr 28 20:01:08.207124 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.207070 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df"} err="failed to get container status \"783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df\": rpc error: code = NotFound desc = could not find container \"783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df\": container with ID starting with 783f67e797572525e4b79b302e83406a804f5609876262dac0e392b98089e0df not found: ID does not exist"
Apr 28 20:01:08.218760 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.218731 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j"]
Apr 28 20:01:08.220110 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.220089 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-93cee-7d756c9f5c-6gn5j"]
Apr 28 20:01:08.513927 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:08.513894 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" path="/var/lib/kubelet/pods/8ea68c33-7034-4b7d-a6b6-4642a4b344f0/volumes"
Apr 28 20:01:24.608551 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:24.608449 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log"
Apr 28 20:01:24.619322 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:24.619284 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log"
Apr 28 20:01:24.624869 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:24.624840 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log"
Apr 28 20:01:24.630268 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:01:24.630251 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log"
Apr 28 20:06:24.637660 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:06:24.637530 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log"
Apr 28 20:06:24.643558 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:06:24.643528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log"
Apr 28 20:06:24.647181 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:06:24.647160 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log"
Apr 28 20:06:24.652674 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:06:24.652655 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log"
Apr 28 20:08:11.772189 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:11.772156 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"]
Apr 28 20:08:11.772738 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:11.772386 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerName="switch-graph-e5e5a" containerID="cri-o://a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2" gracePeriod=30
Apr 28 20:08:11.919176 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:11.919141 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw"]
Apr 28 20:08:11.919482 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:11.919453 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kserve-container" containerID="cri-o://238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a" gracePeriod=30
Apr 28 20:08:11.919564 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:11.919503 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kube-rbac-proxy" containerID="cri-o://ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf" gracePeriod=30
Apr 28 20:08:12.580639 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:12.580578 2576 generic.go:358] "Generic (PLEG): container finished" podID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerID="ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf" exitCode=2
Apr 28 20:08:12.580815 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:12.580639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" event={"ID":"dfc04099-98ac-4199-beeb-90bdf50f0477","Type":"ContainerDied","Data":"ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf"}
Apr 28 20:08:14.386296 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:14.386260 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerName="switch-graph-e5e5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:08:14.970626 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:14.970584 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw"
Apr 28 20:08:15.100619 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.100511 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfc04099-98ac-4199-beeb-90bdf50f0477-proxy-tls\") pod \"dfc04099-98ac-4199-beeb-90bdf50f0477\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") "
Apr 28 20:08:15.100619 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.100558 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfc04099-98ac-4199-beeb-90bdf50f0477-success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\") pod \"dfc04099-98ac-4199-beeb-90bdf50f0477\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") "
Apr 28 20:08:15.100619 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.100588 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7xms\" (UniqueName: \"kubernetes.io/projected/dfc04099-98ac-4199-beeb-90bdf50f0477-kube-api-access-s7xms\") pod \"dfc04099-98ac-4199-beeb-90bdf50f0477\" (UID: \"dfc04099-98ac-4199-beeb-90bdf50f0477\") "
Apr 28 20:08:15.100941 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.100910 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfc04099-98ac-4199-beeb-90bdf50f0477-success-200-isvc-e5e5a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-e5e5a-kube-rbac-proxy-sar-config") pod "dfc04099-98ac-4199-beeb-90bdf50f0477" (UID: "dfc04099-98ac-4199-beeb-90bdf50f0477"). InnerVolumeSpecName "success-200-isvc-e5e5a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 20:08:15.102914 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.102889 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc04099-98ac-4199-beeb-90bdf50f0477-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dfc04099-98ac-4199-beeb-90bdf50f0477" (UID: "dfc04099-98ac-4199-beeb-90bdf50f0477"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:08:15.103021 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.102909 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc04099-98ac-4199-beeb-90bdf50f0477-kube-api-access-s7xms" (OuterVolumeSpecName: "kube-api-access-s7xms") pod "dfc04099-98ac-4199-beeb-90bdf50f0477" (UID: "dfc04099-98ac-4199-beeb-90bdf50f0477"). InnerVolumeSpecName "kube-api-access-s7xms". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 20:08:15.201251 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.201215 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfc04099-98ac-4199-beeb-90bdf50f0477-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 20:08:15.201251 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.201247 2576 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfc04099-98ac-4199-beeb-90bdf50f0477-success-200-isvc-e5e5a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 20:08:15.201251 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.201259 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s7xms\" (UniqueName: \"kubernetes.io/projected/dfc04099-98ac-4199-beeb-90bdf50f0477-kube-api-access-s7xms\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 20:08:15.590451 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.590419 2576 generic.go:358] "Generic (PLEG): container finished" podID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerID="238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a" exitCode=0
Apr 28 20:08:15.590896 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.590478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" event={"ID":"dfc04099-98ac-4199-beeb-90bdf50f0477","Type":"ContainerDied","Data":"238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a"}
Apr 28 20:08:15.590896 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.590489 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw"
Apr 28 20:08:15.590896 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.590516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw" event={"ID":"dfc04099-98ac-4199-beeb-90bdf50f0477","Type":"ContainerDied","Data":"b0e829fc43fa5a17b8162244cdb792f3ced292594962e40f198cbb24329c970b"}
Apr 28 20:08:15.590896 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.590531 2576 scope.go:117] "RemoveContainer" containerID="ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf"
Apr 28 20:08:15.599154 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.599133 2576 scope.go:117] "RemoveContainer" containerID="238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a"
Apr 28 20:08:15.605886 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.605869 2576 scope.go:117] "RemoveContainer" containerID="ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf"
Apr 28 20:08:15.606154 ip-10-0-139-184 kubenswrapper[2576]: E0428 20:08:15.606135 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf\": container with ID starting with ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf not found: ID does not exist" containerID="ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf"
Apr 28 20:08:15.606198 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.606171 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf"} err="failed to get container status \"ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf\": rpc error: code = NotFound desc = could not find container \"ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf\": container with ID starting with ee8d13a421615001853fb3d84d4164e18119a513066d7582d788efa81cdf3ecf not found: ID does not exist"
Apr 28 20:08:15.606198 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.606188 2576 scope.go:117] "RemoveContainer" containerID="238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a"
Apr 28 20:08:15.606429 ip-10-0-139-184 kubenswrapper[2576]: E0428 20:08:15.606412 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a\": container with ID starting with 238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a not found: ID does not exist" containerID="238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a"
Apr 28 20:08:15.606474 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.606435 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a"} err="failed to get container status \"238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a\": rpc error: code = NotFound desc = could not find container \"238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a\": container with ID starting with 238f6c83e59a0532054fbb8cd02cdfffe8a71511867b52b1808becc55abe299a not found: ID does not exist"
Apr 28 20:08:15.612136 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.612113 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw"]
Apr 28 20:08:15.616257 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:15.616236 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e5e5a-predictor-7bd5747694-zwckw"]
Apr 28 20:08:16.515061 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:16.515021 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" path="/var/lib/kubelet/pods/dfc04099-98ac-4199-beeb-90bdf50f0477/volumes"
Apr 28 20:08:19.385526 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:19.385481 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerName="switch-graph-e5e5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:08:24.386139 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:24.386102 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerName="switch-graph-e5e5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:08:24.386509 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:24.386220 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 20:08:27.476107 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:27.476079 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:28.305052 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:28.305011 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:29.105400 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:29.104594 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:29.386464 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:29.386375 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerName="switch-graph-e5e5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:08:29.895405 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:29.895374 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:30.704233 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:30.704200 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:31.504401 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:31.504368 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:32.329491 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:32.329454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:33.136334 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:33.136301 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:33.943183 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:33.943152 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:34.385720 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:34.385678 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerName="switch-graph-e5e5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:08:34.745446 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:34.745412 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:35.553529 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:35.553501 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:36.350269 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:36.350234 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-e5e5a-65c9566545-brp96_44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/switch-graph-e5e5a/0.log"
Apr 28 20:08:39.385637 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:39.385580 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerName="switch-graph-e5e5a" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 20:08:41.924702 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:41.924673 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 20:08:42.022999 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.022960 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-openshift-service-ca-bundle\") pod \"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60\" (UID: \"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60\") "
Apr 28 20:08:42.023178 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.023026 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-proxy-tls\") pod \"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60\" (UID: \"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60\") "
Apr 28 20:08:42.023359 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.023334 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" (UID: "44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 20:08:42.025308 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.025276 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" (UID: "44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 20:08:42.124459 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.124414 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-proxy-tls\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 20:08:42.124459 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.124448 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60-openshift-service-ca-bundle\") on node \"ip-10-0-139-184.ec2.internal\" DevicePath \"\""
Apr 28 20:08:42.677949 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.677910 2576 generic.go:358] "Generic (PLEG): container finished" podID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerID="a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2" exitCode=0
Apr 28 20:08:42.678225 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.677973 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"
Apr 28 20:08:42.678225 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.677998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" event={"ID":"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60","Type":"ContainerDied","Data":"a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2"}
Apr 28 20:08:42.678225 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.678043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96" event={"ID":"44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60","Type":"ContainerDied","Data":"7a70f1624068a7ddc9eb2150b6ba1190d668b2b0265e55b2ae06b05573cf8138"}
Apr 28 20:08:42.678225 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.678059 2576 scope.go:117] "RemoveContainer" containerID="a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2"
Apr 28 20:08:42.686148 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.686128 2576 scope.go:117] "RemoveContainer" containerID="a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2"
Apr 28 20:08:42.686419 ip-10-0-139-184 kubenswrapper[2576]: E0428 20:08:42.686400 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2\": container with ID starting with a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2 not found: ID does not exist" containerID="a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2"
Apr 28 20:08:42.686483 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.686435 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2"} err="failed to get container status \"a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2\": rpc error: code = NotFound desc = could not find container \"a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2\": container with ID starting with a2792d848b775db9b76c7624f594568d803b56f4ae5776be411d7ddf317c2dc2 not found: ID does not exist"
Apr 28 20:08:42.696874 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.696845 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"]
Apr 28 20:08:42.698399 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:42.698371 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-e5e5a-65c9566545-brp96"]
Apr 28 20:08:43.384739 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:43.384709 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-d9gwv_71247a4c-9959-44f9-acd7-c5243ed29332/global-pull-secret-syncer/0.log"
Apr 28 20:08:43.548749 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:43.548708 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-j966f_c1267be0-6760-422e-b253-3d9b5132c496/konnectivity-agent/0.log"
Apr 28 20:08:43.689696 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:43.689592 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-184.ec2.internal_e3440d423eacb4ec58cf1cc320321a41/haproxy/0.log"
Apr 28 20:08:44.513755 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:44.513719 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" path="/var/lib/kubelet/pods/44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60/volumes"
Apr 28 20:08:47.135403 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:47.135365 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-smtv9_6f21f574-c1af-4f40-9435-416276a65b15/cluster-monitoring-operator/0.log"
Apr 28 20:08:47.689011 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:47.688979 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lpcfl_c23a392b-1dc0-46ed-aa02-2f51cadbca4c/node-exporter/0.log"
Apr 28 20:08:47.728107 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:47.728081 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lpcfl_c23a392b-1dc0-46ed-aa02-2f51cadbca4c/kube-rbac-proxy/0.log"
Apr 28 20:08:47.759180 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:47.759149 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lpcfl_c23a392b-1dc0-46ed-aa02-2f51cadbca4c/init-textfile/0.log"
Apr 28 20:08:49.872586 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:49.872548 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/1.log"
Apr 28 20:08:49.880879 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:49.880858 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fts6r_6551d5d0-2583-4478-98ab-1efc22016165/console-operator/2.log"
Apr 28 20:08:50.245442 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.245411 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65d79b9785-vfbq2_074538fe-619d-4f26-a23e-bc0f85a1fc20/console/0.log"
Apr 28 20:08:50.291725 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.291696 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-hgljc_f38d2ea1-629f-4e29-8ffd-33cc8928f1b9/download-server/0.log"
Apr 28 20:08:50.351807 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.351770 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw"]
Apr 28 20:08:50.352070 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352058 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kserve-container"
Apr 28 20:08:50.352119 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352072 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kserve-container"
Apr 28 20:08:50.352119 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352083 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kube-rbac-proxy"
Apr 28 20:08:50.352119 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352089 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kube-rbac-proxy"
Apr 28 20:08:50.352119 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352103 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerName="switch-graph-e5e5a"
Apr 28 20:08:50.352119 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352109 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerName="switch-graph-e5e5a"
Apr 28 20:08:50.352119 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352116 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerName="splitter-graph-93cee"
Apr 28 20:08:50.352305 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352121 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerName="splitter-graph-93cee"
Apr 28 20:08:50.352305 ip-10-0-139-184
kubenswrapper[2576]: I0428 20:08:50.352131 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kube-rbac-proxy" Apr 28 20:08:50.352305 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352136 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kube-rbac-proxy" Apr 28 20:08:50.352305 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352142 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kserve-container" Apr 28 20:08:50.352305 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352147 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kserve-container" Apr 28 20:08:50.352305 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352193 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="44f4ad2a-fae1-4e8e-932b-b7d20fb0eb60" containerName="switch-graph-e5e5a" Apr 28 20:08:50.352305 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352203 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kube-rbac-proxy" Apr 28 20:08:50.352305 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352210 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kserve-container" Apr 28 20:08:50.352305 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352215 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3c065ac-d90a-42d8-86fd-49d138e4cde4" containerName="kube-rbac-proxy" Apr 28 20:08:50.352305 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352223 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ea68c33-7034-4b7d-a6b6-4642a4b344f0" containerName="splitter-graph-93cee" Apr 28 
20:08:50.352305 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.352229 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfc04099-98ac-4199-beeb-90bdf50f0477" containerName="kserve-container" Apr 28 20:08:50.356660 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.356642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.359437 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.359412 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t6fg8\"/\"openshift-service-ca.crt\"" Apr 28 20:08:50.359548 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.359454 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t6fg8\"/\"kube-root-ca.crt\"" Apr 28 20:08:50.360692 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.360674 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-t6fg8\"/\"default-dockercfg-494nn\"" Apr 28 20:08:50.368328 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.368307 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw"] Apr 28 20:08:50.493461 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.493422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-sys\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.493461 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.493461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvst2\" (UniqueName: 
\"kubernetes.io/projected/1e5972b4-b7b9-45ba-8127-25d422f88059-kube-api-access-xvst2\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.493757 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.493482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-podres\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.493757 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.493530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-proc\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.493757 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.493557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-lib-modules\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.594848 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.594747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-sys\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.594848 
ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.594794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvst2\" (UniqueName: \"kubernetes.io/projected/1e5972b4-b7b9-45ba-8127-25d422f88059-kube-api-access-xvst2\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.594848 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.594811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-podres\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.594848 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.594834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-proc\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.595183 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.594865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-lib-modules\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.595183 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.594897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-sys\") pod \"perf-node-gather-daemonset-t8trw\" (UID: 
\"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.595183 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.594907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-proc\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.595183 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.594956 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-podres\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.595183 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.594996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e5972b4-b7b9-45ba-8127-25d422f88059-lib-modules\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.603443 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.603423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvst2\" (UniqueName: \"kubernetes.io/projected/1e5972b4-b7b9-45ba-8127-25d422f88059-kube-api-access-xvst2\") pod \"perf-node-gather-daemonset-t8trw\" (UID: \"1e5972b4-b7b9-45ba-8127-25d422f88059\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.666158 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.666124 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:50.790926 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.790901 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw"] Apr 28 20:08:50.793032 ip-10-0-139-184 kubenswrapper[2576]: W0428 20:08:50.793008 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1e5972b4_b7b9_45ba_8127_25d422f88059.slice/crio-5f05ca40334e7b9d9320947db15ae755552fda78582141374c2ff71b453808b9 WatchSource:0}: Error finding container 5f05ca40334e7b9d9320947db15ae755552fda78582141374c2ff71b453808b9: Status 404 returned error can't find the container with id 5f05ca40334e7b9d9320947db15ae755552fda78582141374c2ff71b453808b9 Apr 28 20:08:50.794754 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:50.794734 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:08:51.551063 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:51.551030 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qjvwn_018d803a-f231-469e-8539-32dcc07e43f8/dns/0.log" Apr 28 20:08:51.597340 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:51.597303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qjvwn_018d803a-f231-469e-8539-32dcc07e43f8/kube-rbac-proxy/0.log" Apr 28 20:08:51.705598 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:51.705559 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rb8nk_edd42d80-2884-4124-a4b2-2aea5543b72b/dns-node-resolver/0.log" Apr 28 20:08:51.711297 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:51.711260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" 
event={"ID":"1e5972b4-b7b9-45ba-8127-25d422f88059","Type":"ContainerStarted","Data":"204a5c3c5cd5153669a40022865819f2c3b4d4657621e1f01d21393fdf6fb963"} Apr 28 20:08:51.711297 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:51.711296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" event={"ID":"1e5972b4-b7b9-45ba-8127-25d422f88059","Type":"ContainerStarted","Data":"5f05ca40334e7b9d9320947db15ae755552fda78582141374c2ff71b453808b9"} Apr 28 20:08:51.711509 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:51.711374 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:08:51.730241 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:51.730198 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" podStartSLOduration=1.730184528 podStartE2EDuration="1.730184528s" podCreationTimestamp="2026-04-28 20:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:08:51.729156827 +0000 UTC m=+3147.692057185" watchObservedRunningTime="2026-04-28 20:08:51.730184528 +0000 UTC m=+3147.693084890" Apr 28 20:08:52.143958 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:52.143921 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6b4b455f45-wj89m_6083dc36-b003-4755-9d7d-93340b1b3f4e/registry/0.log" Apr 28 20:08:52.225667 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:52.225636 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xfk6k_b5119881-7aaa-4ea1-8738-f8463adc7b0c/node-ca/0.log" Apr 28 20:08:53.336686 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:53.336656 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-j8mfv_eb3c06b0-d193-4c98-afaf-e689e3a82af8/serve-healthcheck-canary/0.log" Apr 28 20:08:53.750914 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:53.750882 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xq7sp_65cc2665-0cf0-4e9a-9316-292edb21e2bc/insights-operator/0.log" Apr 28 20:08:53.752235 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:53.752214 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xq7sp_65cc2665-0cf0-4e9a-9316-292edb21e2bc/insights-operator/1.log" Apr 28 20:08:53.854012 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:53.853988 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b7hh9_448e32b7-e5cf-4771-a258-5be2467dc339/kube-rbac-proxy/0.log" Apr 28 20:08:53.876587 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:53.876565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b7hh9_448e32b7-e5cf-4771-a258-5be2467dc339/exporter/0.log" Apr 28 20:08:53.900478 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:53.900455 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-b7hh9_448e32b7-e5cf-4771-a258-5be2467dc339/extractor/0.log" Apr 28 20:08:56.021875 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:56.021844 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-r8x4c_71203882-4a2d-4f7a-a355-6babe49bc167/server/0.log" Apr 28 20:08:56.483778 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:56.483682 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-2sdrd_0e7a1512-ee6c-4bac-8c74-191587ce85b3/manager/0.log" Apr 28 20:08:56.539374 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:56.539344 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-nctcv_76b5d293-e0e7-41d3-afda-56b52042dc5b/seaweedfs/0.log" Apr 28 20:08:57.723972 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:08:57.723941 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-t8trw" Apr 28 20:09:00.730747 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:00.730715 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dq4vl_2ceb6eda-ba8b-4e64-86be-238acc7be78a/migrator/0.log" Apr 28 20:09:00.755162 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:00.755132 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dq4vl_2ceb6eda-ba8b-4e64-86be-238acc7be78a/graceful-termination/0.log" Apr 28 20:09:02.468668 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:02.468640 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wt99f_51e675bf-bae4-491c-adfc-eae81fef84bf/kube-multus-additional-cni-plugins/0.log" Apr 28 20:09:02.493392 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:02.493370 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wt99f_51e675bf-bae4-491c-adfc-eae81fef84bf/egress-router-binary-copy/0.log" Apr 28 20:09:02.520457 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:02.520394 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wt99f_51e675bf-bae4-491c-adfc-eae81fef84bf/cni-plugins/0.log" Apr 28 20:09:02.544669 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:02.544639 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wt99f_51e675bf-bae4-491c-adfc-eae81fef84bf/bond-cni-plugin/0.log" Apr 28 20:09:02.568205 
ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:02.568183 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wt99f_51e675bf-bae4-491c-adfc-eae81fef84bf/routeoverride-cni/0.log" Apr 28 20:09:02.592209 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:02.592187 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wt99f_51e675bf-bae4-491c-adfc-eae81fef84bf/whereabouts-cni-bincopy/0.log" Apr 28 20:09:02.616052 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:02.616031 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wt99f_51e675bf-bae4-491c-adfc-eae81fef84bf/whereabouts-cni/0.log" Apr 28 20:09:02.718745 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:02.718682 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rktwr_d4d04cef-5f05-4c8f-82a1-c1e8350c738c/kube-multus/0.log" Apr 28 20:09:02.744152 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:02.744124 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8j8w9_4236a3f6-5c96-4e29-bb77-8dafe3cd242d/network-metrics-daemon/0.log" Apr 28 20:09:02.777288 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:02.777262 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8j8w9_4236a3f6-5c96-4e29-bb77-8dafe3cd242d/kube-rbac-proxy/0.log" Apr 28 20:09:03.658916 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:03.658884 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-controller/0.log" Apr 28 20:09:03.678439 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:03.678416 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/0.log" Apr 28 20:09:03.706397 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:03.706368 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovn-acl-logging/1.log" Apr 28 20:09:03.735706 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:03.735683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/kube-rbac-proxy-node/0.log" Apr 28 20:09:03.762992 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:03.762953 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/kube-rbac-proxy-ovn-metrics/0.log" Apr 28 20:09:03.782792 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:03.782769 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/northd/0.log" Apr 28 20:09:03.808231 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:03.808195 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/nbdb/0.log" Apr 28 20:09:03.834959 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:03.834934 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/sbdb/0.log" Apr 28 20:09:04.011008 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:04.010972 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5tdk8_ff2f50e1-de53-4f11-a477-9236b340536b/ovnkube-controller/0.log" Apr 28 20:09:05.676813 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:05.676786 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-target-hgdfg_ca288914-564f-4959-9c10-76a6327678fe/network-check-target-container/0.log" Apr 28 20:09:06.723841 ip-10-0-139-184 kubenswrapper[2576]: I0428 20:09:06.723805 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-z9bgv_35de0ddf-e6a6-49cd-b5bd-9d110f16b469/iptables-alerter/0.log"