Apr 23 13:29:32.082431 ip-10-0-132-207 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 13:29:32.082445 ip-10-0-132-207 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 13:29:32.082455 ip-10-0-132-207 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 13:29:32.082768 ip-10-0-132-207 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 13:29:42.188922 ip-10-0-132-207 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 13:29:42.188939 ip-10-0-132-207 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a30509d399e342169fc5fc148f664b25 --
Apr 23 13:32:06.909788 ip-10-0-132-207 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 13:32:07.348017 ip-10-0-132-207 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:07.348017 ip-10-0-132-207 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 13:32:07.348017 ip-10-0-132-207 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:07.348017 ip-10-0-132-207 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 13:32:07.348017 ip-10-0-132-207 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
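
The 'resources' failures in the first boot record come from systemd itself, before any kubelet code runs: an EnvironmentFile referenced by kubelet.service did not exist (systemd treats a missing environment file as fatal unless the path is prefixed with '-'), and the restart job could not be scheduled because crio.service was not installed yet. After the reboot the unit starts, and three of the deprecation warnings point at the config file already passed via --config (/etc/kubernetes/kubelet.conf per the FLAG dump below). A minimal sketch of the equivalent KubeletConfiguration stanzas, assuming the flags keep the values shown in that dump:

    # Hypothetical sketch: config-file equivalents (kubelet.config.k8s.io/v1beta1)
    # for the deprecated flags warned about above; values mirror the FLAG dump.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: "/var/run/crio/crio.sock"            # was --container-runtime-endpoint
    volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # was --volume-plugin-dir
    systemReserved:                                                # was --system-reserved
      cpu: "500m"
      ephemeral-storage: "1Gi"
      memory: "1Gi"
    # --minimum-container-ttl-duration has no config-file equivalent; the warning
    # suggests moving to eviction thresholds (evictionHard / evictionSoft) instead.

--pod-infra-container-image is deprecated differently: it remains a flag until 1.35, and the I-level server.go entry below warns that the same image should also be configured on the CRI side (for CRI-O, the pause_image setting) so the image garbage collector does not prune the sandbox image.
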
Apr 23 13:32:07.349126 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.348767 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 13:32:07.354388 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354373 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:07.354388 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354388 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354392 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354396 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354398 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354401 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354405 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354408 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354411 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354414 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354416 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354419 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354422 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354424 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354427 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354435 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354438 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354441 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354444 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354446 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354449 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354452 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:07.354452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354454 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354457 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354460 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354462 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354465 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354468 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354471 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354473 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354476 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354478 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354481 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354483 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354502 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354505 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354507 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354510 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354513 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354515 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354519 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354523 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:07.354981 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354526 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354528 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354531 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354533 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354536 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354539 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354542 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354545 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354548 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354551 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354554 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354557 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354560 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354562 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354565 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354569 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354573 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354577 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354579 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:07.355531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354582 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354584 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354587 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354590 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354592 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354595 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354598 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354600 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354604 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354607 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354610 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354613 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354615 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354618 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354620 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354623 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354626 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354628 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354631 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:07.355984 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354634 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354637 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354640 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354642 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354645 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.354648 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355001 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355005 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355008 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355011 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355013 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355016 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355019 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355022 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355024 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355027 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355030 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355032 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355035 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:07.356452 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355037 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355040 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355043 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355046 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355049 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355052 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355054 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355057 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355059 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355062 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355064 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355066 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355070 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355073 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355076 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355079 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355082 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355084 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355087 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:07.356926 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355089 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355092 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355094 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355097 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355100 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355103 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355106 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355109 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355111 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355114 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355117 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355119 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355122 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355124 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355127 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355129 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355132 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355136 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355139 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:07.357397 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355142 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355145 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355147 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355150 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355152 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355155 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355158 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355161 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355163 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355166 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355169 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355171 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355175 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355177 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355180 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355182 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355185 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355187 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355190 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355192 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:07.357870 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355195 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355197 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355200 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355202 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355204 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355207 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355209 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355212 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355214 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355217 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355219 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355222 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355224 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355227 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.355229 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356580 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356589 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356595 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356600 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356605 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356608 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 13:32:07.358604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356613 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356617 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356621 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356623 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356627 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356631 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356634 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356637 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356639 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356642 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356645 2577 flags.go:64] FLAG: --cloud-config=""
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356648 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356651 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356654 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356657 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356660 2577 flags.go:64] FLAG: --config-dir=""
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356663 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356666 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356670 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356672 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356675 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356678 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356681 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356684 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 13:32:07.359235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356687 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356690 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356693 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356697 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356700 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356703 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356706 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356709 2577 flags.go:64] FLAG: --enable-server="true"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356712 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356716 2577 flags.go:64] FLAG: --event-burst="100"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356720 2577 flags.go:64] FLAG: --event-qps="50"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356723 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356726 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356728 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356732 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356735 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356738 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356741 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356744 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356747 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356749 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356752 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356755 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356758 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356761 2577 flags.go:64] FLAG: --feature-gates=""
Apr 23 13:32:07.359846 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356764 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356767 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356770 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356773 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356776 2577 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356779 2577 flags.go:64] FLAG: --help="false"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356782 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356785 2577 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356788 2577 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356791 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356794 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356798 2577 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356801 2577 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356804 2577 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356807 2577 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356810 2577 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356813 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356816 2577 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356821 2577 flags.go:64] FLAG: --kube-reserved=""
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356825 2577 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356827 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356830 2577 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356833 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356836 2577 flags.go:64] FLAG: --lock-file=""
Apr 23 13:32:07.360449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356839 2577 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356842 2577 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356845 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356849 2577 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356853 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356855 2577 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356858 2577 flags.go:64] FLAG: --logging-format="text"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356861 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356864 2577 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356867 2577 flags.go:64] FLAG: --manifest-url=""
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356870 2577 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356874 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356877 2577 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356881 2577 flags.go:64] FLAG: --max-pods="110"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356883 2577 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356886 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356889 2577 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356892 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356896 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356899 2577 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356901 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356908 2577 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356912 2577 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356915 2577 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 13:32:07.361051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356917 2577 flags.go:64] FLAG: --pod-cidr=""
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356920 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356926 2577 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356930 2577 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356933 2577 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356936 2577 flags.go:64] FLAG: --port="10250"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356939 2577 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356942 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0391b327e9d5e12c5"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356945 2577 flags.go:64] FLAG: --qos-reserved=""
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356948 2577 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356951 2577 flags.go:64] FLAG: --register-node="true"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356954 2577 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356957 2577 flags.go:64] FLAG: --register-with-taints=""
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356960 2577 flags.go:64] FLAG: --registry-burst="10"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356963 2577 flags.go:64] FLAG: --registry-qps="5"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356966 2577 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356969 2577 flags.go:64] FLAG: --reserved-memory=""
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356978 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356981 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356984 2577 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356987 2577 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356989 2577 flags.go:64] FLAG: --runonce="false"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356993 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356996 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.356999 2577 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 13:32:07.361682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357002 2577 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357016 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357019 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357022 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357026 2577 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357029 2577 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357032 2577 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357034 2577 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357037 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357041 2577 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357043 2577 flags.go:64] FLAG: --system-cgroups=""
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357047 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357052 2577 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357055 2577 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357057 2577 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357064 2577 flags.go:64] FLAG: --tls-min-version=""
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357067 2577 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357069 2577 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357072 2577 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357075 2577 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357078 2577 flags.go:64] FLAG: --v="2"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357082 2577 flags.go:64] FLAG: --version="false"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357086 2577 flags.go:64] FLAG: --vmodule=""
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357090 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357093 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 13:32:07.362300 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357180 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357184 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357187 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357190 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357193 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357196 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357199 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357203 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357205 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
13:32:07.357208 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357210 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357213 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357215 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357218 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357222 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357224 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357228 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357232 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357234 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:32:07.362922 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357238 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357241 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357244 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357248 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357250 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357253 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357256 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357258 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357261 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357263 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357266 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357269 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357271 2577 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357274 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357276 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357279 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357281 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357287 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357290 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357292 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:32:07.363431 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357295 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357297 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357300 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357302 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357305 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357307 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357310 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357312 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357315 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357317 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357320 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357322 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357325 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357329 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357332 2577 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357335 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357338 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357340 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357343 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:07.363961 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357345 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357348 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357350 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357353 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357355 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357358 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357362 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
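[annotation] The flood of W-level "unrecognized feature gate" lines above and below comes from cluster-scoped OpenShift gate names being handed to this component's feature-gate parser, which only registers the gates it knows; each unknown name is logged and skipped, and startup continues, so these read as noise rather than failures. To get the distinct set without scrolling the flood, a minimal sketch in Python (assuming shell access to the node and a readable kubelet journal):

    import re
    import subprocess

    # Pull the kubelet unit's journal for the current boot.
    journal = subprocess.run(
        ["journalctl", "-u", "kubelet", "-b", "--no-pager"],
        capture_output=True, text=True, check=True,
    ).stdout

    # Each warning ends with "unrecognized feature gate: <Name>"; keep the distinct names.
    gates = sorted(set(re.findall(r"unrecognized feature gate: (\S+)", journal)))
    print(len(gates), "distinct unrecognized gates")
    print("\n".join(gates))
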
Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357366 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357368 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357371 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357374 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357377 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357380 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357382 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357385 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357388 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357390 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357393 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357396 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357398 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:07.364451 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357401 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357404 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357406 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357409 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357411 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357414 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357416 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.357419 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.357967 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false 
NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.364640 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.364654 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364697 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364702 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364705 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364708 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:07.364976 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364711 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364714 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364717 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364719 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364722 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364725 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364728 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364730 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364733 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364737 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364739 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364742 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364744 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364747 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:07.365343 
ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364750 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364752 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364755 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364758 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364760 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364763 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:07.365343 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364766 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364770 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364775 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364777 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364780 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364783 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364786 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364789 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364792 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364795 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364797 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364800 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364803 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364806 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364808 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364811 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 
13:32:07.364813 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364816 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364818 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:32:07.365849 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364821 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364823 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364826 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364828 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364831 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364833 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364836 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364838 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364841 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364844 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364846 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364848 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364851 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364870 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364874 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364878 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364881 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364883 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364886 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364889 2577 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Apr 23 13:32:07.366402 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364892 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364895 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364897 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364900 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364903 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364905 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364908 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364910 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364913 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364915 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364918 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364920 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364923 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364925 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364928 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364930 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364933 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364936 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364940 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
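[annotation] The same warning block repeats because the gate set is parsed several times during startup; each pass ends with an I-level feature_gate.go:384 "feature gates: {map[...]}" summary giving the effective kubelet gate map, and this boot emits three identical summaries. To compare passes mechanically, a small Python sketch that turns one summary line into a dict; the literal below is a shortened sample of the map from this log:

    import re

    # Shortened sample of a feature_gate.go:384 summary line from this log.
    line = "feature gates: {map[DynamicResourceAllocation:false ImageVolume:true KMSv1:true NodeSwap:false]}"

    # Entries are space-separated Name:true/false pairs inside map[...].
    pairs = re.findall(r"([A-Za-z0-9]+):(true|false)", line)
    gates = {name: value == "true" for name, value in pairs}

    assert gates["KMSv1"] is True and gates["NodeSwap"] is False
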
Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364943 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:32:07.366903 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364946 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364949 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.364951 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.364956 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365060 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365066 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365070 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365073 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365076 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365079 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365082 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365084 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365088 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365091 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365094 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:07.367383 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365096 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365099 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365102 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:32:07.367818 ip-10-0-132-207 
kubenswrapper[2577]: W0423 13:32:07.365104 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365107 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365109 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365112 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365114 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365117 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365119 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365123 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365127 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365129 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365132 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365134 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365137 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365139 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365142 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365144 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:32:07.367818 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365147 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365150 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365152 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365154 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365157 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365159 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:32:07.368281 ip-10-0-132-207 
kubenswrapper[2577]: W0423 13:32:07.365162 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365164 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365166 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365169 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365171 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365174 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365177 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365179 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365182 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365185 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365187 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365190 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365192 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365194 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:07.368281 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365197 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365200 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365202 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365205 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365208 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365210 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365213 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365215 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 
13:32:07.365218 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365220 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365223 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365225 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365227 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365230 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365232 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365235 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365237 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365240 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365242 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365245 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:07.368779 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365247 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365249 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365253 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365255 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365258 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365260 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365263 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365266 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365268 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365270 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365273 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:07.369274 
ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365276 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365278 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365281 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365283 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:07.365285 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:07.369274 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.365290 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:32:07.369681 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.366021 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 13:32:07.369681 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.368571 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 13:32:07.369681 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.369549 2577 server.go:1019] "Starting client certificate rotation" Apr 23 13:32:07.369681 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.369644 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 13:32:07.369787 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.369685 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 13:32:07.395809 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.395790 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 13:32:07.398186 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.398170 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 13:32:07.407389 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.407371 2577 log.go:25] "Validated CRI v1 runtime API" Apr 23 13:32:07.414015 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.413999 2577 log.go:25] "Validated CRI v1 image API" Apr 23 13:32:07.415664 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.415650 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 13:32:07.417847 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.417826 2577 fs.go:135] Filesystem UUIDs: map[505ee927-0aff-474e-8626-839e8723b53e:/dev/nvme0n1p4 78a7997b-efd3-4651-a838-9d4145ec550f:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 23 13:32:07.417932 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.417845 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} 
/dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 13:32:07.423412 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.423289 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 13:32:07.424343 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.424238 2577 manager.go:217] Machine: {Timestamp:2026-04-23 13:32:07.422469284 +0000 UTC m=+0.400377867 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098655 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ee39ad3d01467f7951191c26d585a SystemUUID:ec2ee39a-d3d0-1467-f795-1191c26d585a BootID:a30509d3-99e3-4216-9fc5-fc148f664b25 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:65:eb:d4:1f:f3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:65:eb:d4:1f:f3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:ed:e8:29:7e:5c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 13:32:07.424343 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.424339 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
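[annotation] The machine record above reports MemoryCapacity:33164492800 bytes (about 30.9 Gi) across 8 logical CPUs, and the container-manager nodeConfig that follows reserves cpu:500m, memory:1Gi, ephemeral-storage:1Gi for the system (KubeReserved is null) with a memory.available hard-eviction threshold of 100Mi. Under the standard Node Allocatable rule (allocatable = capacity - kube-reserved - system-reserved - hard-eviction threshold), a quick check of the memory this node should advertise as allocatable (Python, values copied from this log):

    GI = 1024 ** 3
    MI = 1024 ** 2

    capacity = 33_164_492_800        # MemoryCapacity from the cAdvisor machine record, bytes
    kube_reserved = 0                # KubeReserved is null in the nodeConfig below
    system_reserved = 1 * GI         # SystemReserved memory "1Gi"
    hard_eviction = 100 * MI         # HardEvictionThreshold memory.available "100Mi"

    allocatable = capacity - kube_reserved - system_reserved - hard_eviction
    print(f"{allocatable} bytes = {allocatable / GI:.2f} Gi allocatable memory")  # ~29.79 Gi
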
Apr 23 13:32:07.424447 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.424413 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 13:32:07.427045 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.427018 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 13:32:07.427168 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.427047 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-207.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 13:32:07.427214 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.427176 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 13:32:07.427214 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.427185 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 13:32:07.427214 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.427199 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 13:32:07.427214 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.427213 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 13:32:07.428444 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.428433 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 23 13:32:07.428560 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.428551 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 13:32:07.431103 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.431093 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 23 13:32:07.431140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.431113 2577 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 23 13:32:07.431140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.431125 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 13:32:07.431140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.431135 2577 kubelet.go:397] "Adding apiserver pod source" Apr 23 13:32:07.431227 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.431144 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 13:32:07.432123 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.432111 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:32:07.432164 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.432130 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:32:07.434809 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.434791 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 13:32:07.436033 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.436020 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 13:32:07.438019 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.437984 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 13:32:07.438019 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438011 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 13:32:07.438019 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438018 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 13:32:07.438140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438026 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 13:32:07.438140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438035 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 13:32:07.438140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438044 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 13:32:07.438140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438053 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 13:32:07.438140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438058 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 13:32:07.438140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438065 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 13:32:07.438140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438071 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 13:32:07.438140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438080 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 13:32:07.438140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.438089 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 13:32:07.439966 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.439953 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 13:32:07.439966 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.439966 2577 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 23 13:32:07.442264 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.442242 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 13:32:07.442387 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.442369 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-207.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 13:32:07.443392 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.443379 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 13:32:07.443450 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.443414 2577 server.go:1295] "Started kubelet" Apr 23 13:32:07.443530 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.443504 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 13:32:07.443612 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.443562 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 13:32:07.443668 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.443639 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 13:32:07.444179 ip-10-0-132-207 systemd[1]: Started Kubernetes Kubelet. Apr 23 13:32:07.444774 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.444757 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 13:32:07.446052 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.446037 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 23 13:32:07.449415 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.449397 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 13:32:07.449965 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.449947 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 13:32:07.450574 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.450557 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 13:32:07.450652 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.450579 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 13:32:07.450702 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.450679 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 13:32:07.450785 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.450738 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 23 13:32:07.450785 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.450745 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 23 13:32:07.451071 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.451036 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-207.ec2.internal\" not found" Apr 23 13:32:07.451548 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.451533 2577 factory.go:55] Registering systemd factory Apr 23 13:32:07.451634 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.451552 2577 factory.go:223] Registration of the systemd container 
factory successfully
Apr 23 13:32:07.451756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.451741 2577 factory.go:153] Registering CRI-O factory
Apr 23 13:32:07.451830 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.451759 2577 factory.go:223] Registration of the crio container factory successfully
Apr 23 13:32:07.451830 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.451807 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 13:32:07.451830 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.451827 2577 factory.go:103] Registering Raw factory
Apr 23 13:32:07.451970 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.451842 2577 manager.go:1196] Started watching for new ooms in manager
Apr 23 13:32:07.452141 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.452102 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 13:32:07.452255 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.452241 2577 manager.go:319] Starting recovery of all containers
Apr 23 13:32:07.452328 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.452311 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-207.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 13:32:07.452540 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.452520 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-207.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 13:32:07.455412 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.452400 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-207.ec2.internal.18a8ffa2be7e4087 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-207.ec2.internal,UID:ip-10-0-132-207.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-207.ec2.internal,},FirstTimestamp:2026-04-23 13:32:07.443390599 +0000 UTC m=+0.421299182,LastTimestamp:2026-04-23 13:32:07.443390599 +0000 UTC m=+0.421299182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-207.ec2.internal,}"
Apr 23 13:32:07.458792 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.458760 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 13:32:07.467206 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.467188 2577 manager.go:324] Recovery completed
Apr 23 13:32:07.468374 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.468358 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 23 13:32:07.468696 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.468676 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kkkp4"
Apr 23 13:32:07.471165 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.471153 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:07.473365 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.473352 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:07.473445 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.473376 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:07.473445 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.473386 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:07.473891 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.473878 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 13:32:07.473891 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.473891 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 13:32:07.473975 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.473905 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:32:07.475826 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.475762 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-207.ec2.internal.18a8ffa2c0479f35 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-207.ec2.internal,UID:ip-10-0-132-207.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-207.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-207.ec2.internal,},FirstTimestamp:2026-04-23 13:32:07.473364789 +0000 UTC m=+0.451273371,LastTimestamp:2026-04-23 13:32:07.473364789 +0000 UTC m=+0.451273371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-207.ec2.internal,}"
Apr 23 13:32:07.476130 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.476118 2577 policy_none.go:49] "None policy: Start"
Apr 23 13:32:07.476179 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.476133 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 13:32:07.476179 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.476143 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 13:32:07.476543 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.476527 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kkkp4"
Apr 23 13:32:07.518087 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.518071 2577 manager.go:341] "Starting Device Plugin manager"
Apr 23 13:32:07.518628 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.518113 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 13:32:07.518628 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.518127 2577 server.go:85] "Starting device plugin registration server"
Apr 23 13:32:07.518628 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.518335 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 13:32:07.518628 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.518348 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 13:32:07.518628 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.518451 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 13:32:07.518628 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.518539 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 13:32:07.518628 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.518549 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 13:32:07.519078 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.519018 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 13:32:07.519078 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.519069 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:07.577830 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.577801 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 13:32:07.578915 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.578900 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 13:32:07.579011 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.578937 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 13:32:07.579011 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.578958 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 13:32:07.579011 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.578968 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 13:32:07.579011 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.579005 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 13:32:07.582098 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.582080 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:07.620779 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.620735 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:07.621537 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.621523 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:07.621611 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.621550 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:07.621611 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.621562 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:07.621611 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.621587 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.633307 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.633291 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.633353 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.633311 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-207.ec2.internal\": node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:07.657136 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.657115 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:07.679404 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.679382 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal"]
Apr 23 13:32:07.679472 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.679439 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:07.680776 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.680760 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:07.680859 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.680786 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:07.680859 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.680798 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:07.682079 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.682067 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:07.682214 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.682202 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.682272 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.682229 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:07.682711 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.682696 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:07.682790 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.682697 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:07.682790 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.682745 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:07.682790 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.682757 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:07.682790 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.682722 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:07.682790 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.682791 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:07.684037 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.684020 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.684115 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.684048 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:07.684698 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.684685 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:07.684763 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.684711 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:07.684763 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.684723 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:07.700994 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.700976 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-207.ec2.internal\" not found" node="ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.704284 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.704270 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-207.ec2.internal\" not found" node="ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.757934 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.757916 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:07.852143 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.852122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e0714ba56e53b69dc64c17ef7d8f7924-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal\" (UID: \"e0714ba56e53b69dc64c17ef7d8f7924\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.852226 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.852148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0714ba56e53b69dc64c17ef7d8f7924-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal\" (UID: \"e0714ba56e53b69dc64c17ef7d8f7924\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.852226 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.852164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bbbaef9da88b934e809d0d29bccb4dd7-config\") pod \"kube-apiserver-proxy-ip-10-0-132-207.ec2.internal\" (UID: \"bbbaef9da88b934e809d0d29bccb4dd7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.858215 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.858200 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:07.952876 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.952849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0714ba56e53b69dc64c17ef7d8f7924-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal\" (UID: \"e0714ba56e53b69dc64c17ef7d8f7924\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.952876 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.952881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bbbaef9da88b934e809d0d29bccb4dd7-config\") pod \"kube-apiserver-proxy-ip-10-0-132-207.ec2.internal\" (UID: \"bbbaef9da88b934e809d0d29bccb4dd7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.952988 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.952897 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e0714ba56e53b69dc64c17ef7d8f7924-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal\" (UID: \"e0714ba56e53b69dc64c17ef7d8f7924\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.952988 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.952920 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e0714ba56e53b69dc64c17ef7d8f7924-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal\" (UID: \"e0714ba56e53b69dc64c17ef7d8f7924\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.952988 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.952937 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0714ba56e53b69dc64c17ef7d8f7924-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal\" (UID: \"e0714ba56e53b69dc64c17ef7d8f7924\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.952988 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:07.952946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bbbaef9da88b934e809d0d29bccb4dd7-config\") pod \"kube-apiserver-proxy-ip-10-0-132-207.ec2.internal\" (UID: \"bbbaef9da88b934e809d0d29bccb4dd7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:07.958967 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:07.958949 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:08.003114 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.003100 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:08.006460 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.006444 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:08.059135 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:08.059113 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:08.159665 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:08.159646 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:08.260137 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:08.260094 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:08.360623 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:08.360589 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:08.369748 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.369731 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 13:32:08.369865 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.369850 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:32:08.450263 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.450237 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 13:32:08.461136 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:08.461110 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-207.ec2.internal\" not found"
Apr 23 13:32:08.470595 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.470569 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:32:08.478373 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.478345 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:27:07 +0000 UTC" deadline="2028-01-06 17:33:13.436010906 +0000 UTC"
Apr 23 13:32:08.478373 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.478371 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14956h1m4.957643005s"
Apr 23 13:32:08.490222 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.490203 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-26n5p"
Apr 23 13:32:08.497396 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.497380 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-26n5p"
Apr 23 13:32:08.528967 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.528922 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:08.551134 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.551117 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:08.563615 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.563600 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 13:32:08.565477 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.565463 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal"
Apr 23 13:32:08.574543 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.574528 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 13:32:08.591595 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:08.591571 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0714ba56e53b69dc64c17ef7d8f7924.slice/crio-612cdd3df255e2b132f6494fa922f80ed9a205753acccc6e3631c4a643f32c13 WatchSource:0}: Error finding container 612cdd3df255e2b132f6494fa922f80ed9a205753acccc6e3631c4a643f32c13: Status 404 returned error can't find the container with id 612cdd3df255e2b132f6494fa922f80ed9a205753acccc6e3631c4a643f32c13
Apr 23 13:32:08.592046 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:08.592025 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbbaef9da88b934e809d0d29bccb4dd7.slice/crio-da4cd21a353103cc1bdb2795d2f23f3b33f9995bf5e4cabdcfb59576244f982b WatchSource:0}: Error finding container da4cd21a353103cc1bdb2795d2f23f3b33f9995bf5e4cabdcfb59576244f982b: Status 404 returned error can't find the container with id da4cd21a353103cc1bdb2795d2f23f3b33f9995bf5e4cabdcfb59576244f982b
Apr 23 13:32:08.597677 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.597663 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:32:08.632147 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.632124 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:08.723023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:08.723004 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:09.432110 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.432080 2577 apiserver.go:52] "Watching apiserver"
Apr 23 13:32:09.440432 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.440405 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 13:32:09.441158 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.441132 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-fl2td","openshift-network-operator/iptables-alerter-v5s77","openshift-ovn-kubernetes/ovnkube-node-mwbjc","kube-system/konnectivity-agent-shltj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r","openshift-cluster-node-tuning-operator/tuned-g4zgx","openshift-dns/node-resolver-wxlj2","openshift-image-registry/node-ca-mbln5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal","openshift-multus/multus-additional-cni-plugins-p9tzn","openshift-multus/multus-xvg2d","kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal","openshift-multus/network-metrics-daemon-6pz6w"]
Apr 23 13:32:09.444069 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.444047 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p9tzn"
Apr 23 13:32:09.446678 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.446445 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-v5s77"
Apr 23 13:32:09.446867 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.446848 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 13:32:09.447400 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.447247 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 13:32:09.447400 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.447282 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 13:32:09.447400 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.447285 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 13:32:09.447400 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.447326 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 13:32:09.447400 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.447370 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2dsfl\""
Apr 23 13:32:09.449205 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.449185 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ft7qb\""
Apr 23 13:32:09.449451 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.449434 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 13:32:09.449549 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.449481 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:32:09.449745 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.449727 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 13:32:09.450987 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.450915 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.455124 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.454002 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-shltj"
Apr 23 13:32:09.455124 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.454824 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 13:32:09.455124 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.455024 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-bqtrf\""
Apr 23 13:32:09.455461 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.455187 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 13:32:09.455461 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.455271 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 13:32:09.455461 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.455331 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 13:32:09.456046 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.456025 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 13:32:09.456151 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.456125 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 13:32:09.456502 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.456404 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r"
Apr 23 13:32:09.456593 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.456526 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-g4zgx"
Apr 23 13:32:09.456593 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.456543 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 13:32:09.456700 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.456665 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 13:32:09.457286 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.456939 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7vkbp\""
Apr 23 13:32:09.459082 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.459061 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 13:32:09.459295 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.459273 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-b6qz7\""
Apr 23 13:32:09.460157 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.460142 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:32:09.460516 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.460146 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 13:32:09.460516 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.460186 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jnpgw\""
Apr 23 13:32:09.460657 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.460644 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 13:32:09.461037 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn"
Apr 23 13:32:09.461122 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k592\" (UniqueName: \"kubernetes.io/projected/4a590caf-dc65-421e-a4c8-40d3258ddd7b-kube-api-access-4k592\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn"
Apr 23 13:32:09.461122 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461076 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-etc-openvswitch\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461122 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461099 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-cni-bin\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461338 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-system-cni-dir\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn"
Apr 23 13:32:09.461338 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a590caf-dc65-421e-a4c8-40d3258ddd7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn"
Apr 23 13:32:09.461338 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461180 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 13:32:09.461338 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a590caf-dc65-421e-a4c8-40d3258ddd7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn"
Apr 23 13:32:09.461338 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461241 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-kubelet\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461338 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461266 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-run-netns\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461338 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461290 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-log-socket\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461338 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-os-release\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-cni-netd\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461369 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58abee5a-98ee-4a90-ab84-a17d06d08d00-ovn-node-metrics-cert\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461394 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-run-ovn-kubernetes\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461373 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mbln5"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461435 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-cnibin\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461467 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8de6e215-aa9d-4003-aac4-d2a7bbdb59fb-host-slash\") pod \"iptables-alerter-v5s77\" (UID: \"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb\") " pod="openshift-network-operator/iptables-alerter-v5s77"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-slash\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461532 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-var-lib-openvswitch\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-run-openvswitch\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58abee5a-98ee-4a90-ab84-a17d06d08d00-ovnkube-config\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461613 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpjmn\" (UniqueName: \"kubernetes.io/projected/58abee5a-98ee-4a90-ab84-a17d06d08d00-kube-api-access-cpjmn\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2241e05b-7796-4e2e-b1cf-f47baaeef969-agent-certs\") pod \"konnectivity-agent-shltj\" (UID: \"2241e05b-7796-4e2e-b1cf-f47baaeef969\") " pod="kube-system/konnectivity-agent-shltj"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-systemd-units\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-run-ovn\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461735 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.461761 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8de6e215-aa9d-4003-aac4-d2a7bbdb59fb-iptables-alerter-script\") pod \"iptables-alerter-v5s77\" (UID: \"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb\") " pod="openshift-network-operator/iptables-alerter-v5s77"
Apr 23 13:32:09.462698 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461786 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trbct\" (UniqueName: \"kubernetes.io/projected/8de6e215-aa9d-4003-aac4-d2a7bbdb59fb-kube-api-access-trbct\") pod \"iptables-alerter-v5s77\" (UID: \"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb\") " pod="openshift-network-operator/iptables-alerter-v5s77"
Apr 23 13:32:09.462698 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-run-systemd\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.462698 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-node-log\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.462698 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58abee5a-98ee-4a90-ab84-a17d06d08d00-ovnkube-script-lib\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.462698 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461962 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2241e05b-7796-4e2e-b1cf-f47baaeef969-konnectivity-ca\") pod \"konnectivity-agent-shltj\" (UID: \"2241e05b-7796-4e2e-b1cf-f47baaeef969\") " pod="kube-system/konnectivity-agent-shltj"
Apr 23 13:32:09.462698 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.461990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4a590caf-dc65-421e-a4c8-40d3258ddd7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn"
Apr 23 13:32:09.462698 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.462026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58abee5a-98ee-4a90-ab84-a17d06d08d00-env-overrides\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:32:09.463665 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.463645 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td"
Apr 23 13:32:09.463758 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:09.463735 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4"
Apr 23 13:32:09.463822 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.463766 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xvg2d"
Need to start a new one" pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.463946 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.463925 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-svnt7\"" Apr 23 13:32:09.464009 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.464001 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 13:32:09.464090 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.464060 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 13:32:09.464288 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.464267 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 13:32:09.466086 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.466066 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:09.466213 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:09.466142 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:09.466547 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.466434 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 13:32:09.466547 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.466434 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5r6gd\"" Apr 23 13:32:09.468437 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.468418 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wxlj2" Apr 23 13:32:09.470910 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.470892 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 13:32:09.471235 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.471218 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wk8bw\"" Apr 23 13:32:09.471319 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.471226 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 13:32:09.498847 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.498814 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:27:08 +0000 UTC" deadline="2028-01-05 08:04:53.109063744 +0000 UTC" Apr 23 13:32:09.498847 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.498847 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14922h32m43.610220999s" Apr 23 13:32:09.552376 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.552354 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 13:32:09.562987 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.562959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4a590caf-dc65-421e-a4c8-40d3258ddd7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.563099 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.562997 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58abee5a-98ee-4a90-ab84-a17d06d08d00-env-overrides\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.563099 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563029 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aecfee63-4703-49e8-81cc-aa07bc06ce4e-serviceca\") pod \"node-ca-mbln5\" (UID: \"aecfee63-4703-49e8-81cc-aa07bc06ce4e\") " pod="openshift-image-registry/node-ca-mbln5" Apr 23 13:32:09.563099 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-host\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.563099 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563077 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-conf-dir\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.563296 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563103 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.563296 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4k592\" (UniqueName: \"kubernetes.io/projected/4a590caf-dc65-421e-a4c8-40d3258ddd7b-kube-api-access-4k592\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.563296 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-etc-openvswitch\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.563296 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563195 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-device-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.563296 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563205 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-etc-openvswitch\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.563296 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563230 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-kubernetes\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.563296 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-var-lib-cni-multus\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.563296 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.563296 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/5ab5642a-1989-41c1-956f-98f92fcc6f23-hosts-file\") pod \"node-resolver-wxlj2\" (UID: \"5ab5642a-1989-41c1-956f-98f92fcc6f23\") " pod="openshift-dns/node-resolver-wxlj2" Apr 23 13:32:09.563703 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a590caf-dc65-421e-a4c8-40d3258ddd7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.563703 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.563359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-kubelet\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564284 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-kubelet\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-run-netns\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-log-socket\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564395 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-sys\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-os-release\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58abee5a-98ee-4a90-ab84-a17d06d08d00-ovn-node-metrics-cert\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564469 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4a590caf-dc65-421e-a4c8-40d3258ddd7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-tuned\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-registration-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564596 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-modprobe-d\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/58abee5a-98ee-4a90-ab84-a17d06d08d00-env-overrides\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-os-release\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-log-socket\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-slash\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a590caf-dc65-421e-a4c8-40d3258ddd7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-p9tzn\" 
(UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-var-lib-openvswitch\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.564943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-slash\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-var-lib-openvswitch\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564900 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-run-openvswitch\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564924 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-run-netns\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.564812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-run-openvswitch\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565031 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58abee5a-98ee-4a90-ab84-a17d06d08d00-ovnkube-config\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2241e05b-7796-4e2e-b1cf-f47baaeef969-agent-certs\") pod \"konnectivity-agent-shltj\" (UID: \"2241e05b-7796-4e2e-b1cf-f47baaeef969\") " pod="kube-system/konnectivity-agent-shltj" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-socket-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-systemd\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565155 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-var-lib-kubelet\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-run-ovn\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565215 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-run\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjg76\" (UniqueName: \"kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76\") pod \"network-check-target-fl2td\" (UID: \"373af144-ae77-4496-8057-d855373807e4\") " pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trbct\" (UniqueName: \"kubernetes.io/projected/8de6e215-aa9d-4003-aac4-d2a7bbdb59fb-kube-api-access-trbct\") pod \"iptables-alerter-v5s77\" (UID: \"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb\") " pod="openshift-network-operator/iptables-alerter-v5s77" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-run-systemd\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58abee5a-98ee-4a90-ab84-a17d06d08d00-ovnkube-script-lib\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.565756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2241e05b-7796-4e2e-b1cf-f47baaeef969-konnectivity-ca\") pod \"konnectivity-agent-shltj\" (UID: \"2241e05b-7796-4e2e-b1cf-f47baaeef969\") " pod="kube-system/konnectivity-agent-shltj" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr8wc\" (UniqueName: \"kubernetes.io/projected/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-kube-api-access-vr8wc\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565466 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-sysctl-d\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-system-cni-dir\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565542 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-cni-dir\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565574 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-socket-dir-parent\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-run-k8s-cni-cncf-io\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/58abee5a-98ee-4a90-ab84-a17d06d08d00-ovnkube-config\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-run-netns\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565654 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-run-ovn\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.565689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kpq6\" (UniqueName: \"kubernetes.io/projected/821df7f9-3f87-4f86-a7e9-82cad302fff0-kube-api-access-5kpq6\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 
13:32:09.565743 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-run-systemd\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.566382 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/58abee5a-98ee-4a90-ab84-a17d06d08d00-ovnkube-script-lib\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.566458 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.566448 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2241e05b-7796-4e2e-b1cf-f47baaeef969-konnectivity-ca\") pod \"konnectivity-agent-shltj\" (UID: \"2241e05b-7796-4e2e-b1cf-f47baaeef969\") " pod="kube-system/konnectivity-agent-shltj" Apr 23 13:32:09.568004 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.567853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-cni-bin\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.568004 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.567865 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-cni-bin\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.568004 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.567897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-lib-modules\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.568004 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.567945 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b61ffc5b-bae6-4b74-a181-3e3df6606045-cni-binary-copy\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.568004 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.567973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fndp5\" (UniqueName: \"kubernetes.io/projected/b61ffc5b-bae6-4b74-a181-3e3df6606045-kube-api-access-fndp5\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.568295 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-system-cni-dir\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " 
pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.568295 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-system-cni-dir\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.568295 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a590caf-dc65-421e-a4c8-40d3258ddd7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.568295 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aecfee63-4703-49e8-81cc-aa07bc06ce4e-host\") pod \"node-ca-mbln5\" (UID: \"aecfee63-4703-49e8-81cc-aa07bc06ce4e\") " pod="openshift-image-registry/node-ca-mbln5" Apr 23 13:32:09.568295 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-469cf\" (UniqueName: \"kubernetes.io/projected/aecfee63-4703-49e8-81cc-aa07bc06ce4e-kube-api-access-469cf\") pod \"node-ca-mbln5\" (UID: \"aecfee63-4703-49e8-81cc-aa07bc06ce4e\") " pod="openshift-image-registry/node-ca-mbln5" Apr 23 13:32:09.568295 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568179 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-os-release\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.568650 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-run-multus-certs\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.568650 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568360 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-cni-netd\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.568650 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-sysctl-conf\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.568650 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568421 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-etc-kubernetes\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.568650 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568503 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/58abee5a-98ee-4a90-ab84-a17d06d08d00-ovn-node-metrics-cert\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.568650 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzzwf\" (UniqueName: \"kubernetes.io/projected/5ab5642a-1989-41c1-956f-98f92fcc6f23-kube-api-access-qzzwf\") pod \"node-resolver-wxlj2\" (UID: \"5ab5642a-1989-41c1-956f-98f92fcc6f23\") " pod="openshift-dns/node-resolver-wxlj2" Apr 23 13:32:09.568650 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-cni-netd\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.568650 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-run-ovn-kubernetes\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.568650 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568631 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-etc-selinux\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.569028 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-run-ovn-kubernetes\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.569028 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c29fef9-0671-485c-988d-0b06e4091d1a-tmp\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.569028 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568717 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-daemon-config\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 
13:32:09.569028 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-cnibin\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.569028 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.568875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a590caf-dc65-421e-a4c8-40d3258ddd7b-cnibin\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.569228 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569205 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8de6e215-aa9d-4003-aac4-d2a7bbdb59fb-host-slash\") pod \"iptables-alerter-v5s77\" (UID: \"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb\") " pod="openshift-network-operator/iptables-alerter-v5s77" Apr 23 13:32:09.569306 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpjmn\" (UniqueName: \"kubernetes.io/projected/58abee5a-98ee-4a90-ab84-a17d06d08d00-kube-api-access-cpjmn\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.569362 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569331 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-cnibin\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.569362 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8de6e215-aa9d-4003-aac4-d2a7bbdb59fb-host-slash\") pod \"iptables-alerter-v5s77\" (UID: \"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb\") " pod="openshift-network-operator/iptables-alerter-v5s77" Apr 23 13:32:09.569459 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-var-lib-cni-bin\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.569459 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5ab5642a-1989-41c1-956f-98f92fcc6f23-tmp-dir\") pod \"node-resolver-wxlj2\" (UID: \"5ab5642a-1989-41c1-956f-98f92fcc6f23\") " pod="openshift-dns/node-resolver-wxlj2" Apr 23 13:32:09.569459 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a590caf-dc65-421e-a4c8-40d3258ddd7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p9tzn\" 
(UID: \"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.569459 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-systemd-units\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.569459 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-systemd-units\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.569712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.569712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.569712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569535 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-sysconfig\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.569712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569545 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.569712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569566 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-var-lib-kubelet\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.569712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8de6e215-aa9d-4003-aac4-d2a7bbdb59fb-iptables-alerter-script\") pod \"iptables-alerter-v5s77\" (UID: \"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb\") " pod="openshift-network-operator/iptables-alerter-v5s77" Apr 23 13:32:09.569712 ip-10-0-132-207 
kubenswrapper[2577]: I0423 13:32:09.569618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-node-log\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.569712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-sys-fs\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.569712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxj2c\" (UniqueName: \"kubernetes.io/projected/7c29fef9-0671-485c-988d-0b06e4091d1a-kube-api-access-xxj2c\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.569712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569687 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/58abee5a-98ee-4a90-ab84-a17d06d08d00-node-log\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.569712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.569704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-hostroot\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.570266 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.570242 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2241e05b-7796-4e2e-b1cf-f47baaeef969-agent-certs\") pod \"konnectivity-agent-shltj\" (UID: \"2241e05b-7796-4e2e-b1cf-f47baaeef969\") " pod="kube-system/konnectivity-agent-shltj" Apr 23 13:32:09.570543 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.570522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8de6e215-aa9d-4003-aac4-d2a7bbdb59fb-iptables-alerter-script\") pod \"iptables-alerter-v5s77\" (UID: \"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb\") " pod="openshift-network-operator/iptables-alerter-v5s77" Apr 23 13:32:09.574146 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.574124 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trbct\" (UniqueName: \"kubernetes.io/projected/8de6e215-aa9d-4003-aac4-d2a7bbdb59fb-kube-api-access-trbct\") pod \"iptables-alerter-v5s77\" (UID: \"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb\") " pod="openshift-network-operator/iptables-alerter-v5s77" Apr 23 13:32:09.574410 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.574392 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k592\" (UniqueName: \"kubernetes.io/projected/4a590caf-dc65-421e-a4c8-40d3258ddd7b-kube-api-access-4k592\") pod \"multus-additional-cni-plugins-p9tzn\" (UID: 
\"4a590caf-dc65-421e-a4c8-40d3258ddd7b\") " pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.577076 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.577054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpjmn\" (UniqueName: \"kubernetes.io/projected/58abee5a-98ee-4a90-ab84-a17d06d08d00-kube-api-access-cpjmn\") pod \"ovnkube-node-mwbjc\" (UID: \"58abee5a-98ee-4a90-ab84-a17d06d08d00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.582608 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.582571 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal" event={"ID":"bbbaef9da88b934e809d0d29bccb4dd7","Type":"ContainerStarted","Data":"da4cd21a353103cc1bdb2795d2f23f3b33f9995bf5e4cabdcfb59576244f982b"} Apr 23 13:32:09.583905 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.583879 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal" event={"ID":"e0714ba56e53b69dc64c17ef7d8f7924","Type":"ContainerStarted","Data":"612cdd3df255e2b132f6494fa922f80ed9a205753acccc6e3631c4a643f32c13"} Apr 23 13:32:09.670134 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-socket-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.670134 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-systemd\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-var-lib-kubelet\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-run\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjg76\" (UniqueName: \"kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76\") pod \"network-check-target-fl2td\" (UID: \"373af144-ae77-4496-8057-d855373807e4\") " pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr8wc\" (UniqueName: 
\"kubernetes.io/projected/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-kube-api-access-vr8wc\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-var-lib-kubelet\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670257 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-sysctl-d\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-systemd\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-socket-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-system-cni-dir\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670305 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-cni-dir\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-run\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-socket-dir-parent\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670375 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670367 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-socket-dir-parent\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670382 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-sysctl-d\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-run-k8s-cni-cncf-io\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670363 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-system-cni-dir\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-cni-dir\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-run-netns\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-run-netns\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670442 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-run-k8s-cni-cncf-io\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670485 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kpq6\" (UniqueName: 
\"kubernetes.io/projected/821df7f9-3f87-4f86-a7e9-82cad302fff0-kube-api-access-5kpq6\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-lib-modules\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:09.670562 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-lib-modules\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:09.670626 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs podName:821df7f9-3f87-4f86-a7e9-82cad302fff0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:10.170598805 +0000 UTC m=+3.148507375 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs") pod "network-metrics-daemon-6pz6w" (UID: "821df7f9-3f87-4f86-a7e9-82cad302fff0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b61ffc5b-bae6-4b74-a181-3e3df6606045-cni-binary-copy\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fndp5\" (UniqueName: \"kubernetes.io/projected/b61ffc5b-bae6-4b74-a181-3e3df6606045-kube-api-access-fndp5\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aecfee63-4703-49e8-81cc-aa07bc06ce4e-host\") pod \"node-ca-mbln5\" (UID: \"aecfee63-4703-49e8-81cc-aa07bc06ce4e\") " pod="openshift-image-registry/node-ca-mbln5" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-469cf\" (UniqueName: \"kubernetes.io/projected/aecfee63-4703-49e8-81cc-aa07bc06ce4e-kube-api-access-469cf\") pod \"node-ca-mbln5\" (UID: \"aecfee63-4703-49e8-81cc-aa07bc06ce4e\") " pod="openshift-image-registry/node-ca-mbln5" Apr 23 13:32:09.670989 ip-10-0-132-207 kubenswrapper[2577]: 
I0423 13:32:09.670717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-os-release\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-run-multus-certs\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-sysctl-conf\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aecfee63-4703-49e8-81cc-aa07bc06ce4e-host\") pod \"node-ca-mbln5\" (UID: \"aecfee63-4703-49e8-81cc-aa07bc06ce4e\") " pod="openshift-image-registry/node-ca-mbln5" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-etc-kubernetes\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-os-release\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670808 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-etc-kubernetes\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzzwf\" (UniqueName: \"kubernetes.io/projected/5ab5642a-1989-41c1-956f-98f92fcc6f23-kube-api-access-qzzwf\") pod \"node-resolver-wxlj2\" (UID: \"5ab5642a-1989-41c1-956f-98f92fcc6f23\") " pod="openshift-dns/node-resolver-wxlj2" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-run-multus-certs\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670851 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-etc-selinux\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c29fef9-0671-485c-988d-0b06e4091d1a-tmp\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-etc-selinux\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670919 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-sysctl-conf\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670940 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-daemon-config\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-cnibin\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.670998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-var-lib-cni-bin\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-var-lib-cni-bin\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-cnibin\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.671782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/5ab5642a-1989-41c1-956f-98f92fcc6f23-tmp-dir\") pod \"node-resolver-wxlj2\" (UID: \"5ab5642a-1989-41c1-956f-98f92fcc6f23\") " pod="openshift-dns/node-resolver-wxlj2" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671148 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-sysconfig\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671155 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b61ffc5b-bae6-4b74-a181-3e3df6606045-cni-binary-copy\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671175 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-var-lib-kubelet\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-sysconfig\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671213 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-sys-fs\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-var-lib-kubelet\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671260 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xxj2c\" (UniqueName: \"kubernetes.io/projected/7c29fef9-0671-485c-988d-0b06e4091d1a-kube-api-access-xxj2c\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-hostroot\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aecfee63-4703-49e8-81cc-aa07bc06ce4e-serviceca\") pod \"node-ca-mbln5\" (UID: \"aecfee63-4703-49e8-81cc-aa07bc06ce4e\") " pod="openshift-image-registry/node-ca-mbln5" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-sys-fs\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-host\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-conf-dir\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671374 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-hostroot\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-device-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671421 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-kubernetes\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671426 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5ab5642a-1989-41c1-956f-98f92fcc6f23-tmp-dir\") pod \"node-resolver-wxlj2\" (UID: \"5ab5642a-1989-41c1-956f-98f92fcc6f23\") " pod="openshift-dns/node-resolver-wxlj2" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-var-lib-cni-multus\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671463 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-conf-dir\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671422 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-host\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671479 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b61ffc5b-bae6-4b74-a181-3e3df6606045-multus-daemon-config\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5ab5642a-1989-41c1-956f-98f92fcc6f23-hosts-file\") pod \"node-resolver-wxlj2\" (UID: \"5ab5642a-1989-41c1-956f-98f92fcc6f23\") " pod="openshift-dns/node-resolver-wxlj2" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-device-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5ab5642a-1989-41c1-956f-98f92fcc6f23-hosts-file\") pod \"node-resolver-wxlj2\" (UID: \"5ab5642a-1989-41c1-956f-98f92fcc6f23\") " pod="openshift-dns/node-resolver-wxlj2" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b61ffc5b-bae6-4b74-a181-3e3df6606045-host-var-lib-cni-multus\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671509 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-kubernetes\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-sys\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-tuned\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671625 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-sys\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-registration-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-modprobe-d\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-registration-dir\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671750 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-modprobe-d\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.672826 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.671793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aecfee63-4703-49e8-81cc-aa07bc06ce4e-serviceca\") pod \"node-ca-mbln5\" (UID: \"aecfee63-4703-49e8-81cc-aa07bc06ce4e\") " pod="openshift-image-registry/node-ca-mbln5" Apr 23 13:32:09.673322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.673045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/7c29fef9-0671-485c-988d-0b06e4091d1a-tmp\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.673648 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.673633 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7c29fef9-0671-485c-988d-0b06e4091d1a-etc-tuned\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.679359 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.679331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr8wc\" (UniqueName: \"kubernetes.io/projected/0fc2bb02-44f9-45bf-aa27-62f98f04e5ac-kube-api-access-vr8wc\") pod \"aws-ebs-csi-driver-node-9pf4r\" (UID: \"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.679469 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.679449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzzwf\" (UniqueName: \"kubernetes.io/projected/5ab5642a-1989-41c1-956f-98f92fcc6f23-kube-api-access-qzzwf\") pod \"node-resolver-wxlj2\" (UID: \"5ab5642a-1989-41c1-956f-98f92fcc6f23\") " pod="openshift-dns/node-resolver-wxlj2" Apr 23 13:32:09.679893 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.679874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fndp5\" (UniqueName: \"kubernetes.io/projected/b61ffc5b-bae6-4b74-a181-3e3df6606045-kube-api-access-fndp5\") pod \"multus-xvg2d\" (UID: \"b61ffc5b-bae6-4b74-a181-3e3df6606045\") " pod="openshift-multus/multus-xvg2d" Apr 23 13:32:09.680029 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:09.680012 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:09.680099 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:09.680034 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:09.680099 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:09.680047 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gjg76 for pod openshift-network-diagnostics/network-check-target-fl2td: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:09.680166 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:09.680116 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76 podName:373af144-ae77-4496-8057-d855373807e4 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:10.180100727 +0000 UTC m=+3.158009302 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gjg76" (UniqueName: "kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76") pod "network-check-target-fl2td" (UID: "373af144-ae77-4496-8057-d855373807e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:09.680166 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.680118 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kpq6\" (UniqueName: \"kubernetes.io/projected/821df7f9-3f87-4f86-a7e9-82cad302fff0-kube-api-access-5kpq6\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:09.682177 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.682112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxj2c\" (UniqueName: \"kubernetes.io/projected/7c29fef9-0671-485c-988d-0b06e4091d1a-kube-api-access-xxj2c\") pod \"tuned-g4zgx\" (UID: \"7c29fef9-0671-485c-988d-0b06e4091d1a\") " pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.682177 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.682124 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-469cf\" (UniqueName: \"kubernetes.io/projected/aecfee63-4703-49e8-81cc-aa07bc06ce4e-kube-api-access-469cf\") pod \"node-ca-mbln5\" (UID: \"aecfee63-4703-49e8-81cc-aa07bc06ce4e\") " pod="openshift-image-registry/node-ca-mbln5" Apr 23 13:32:09.707281 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.707251 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:09.758573 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.758551 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" Apr 23 13:32:09.768361 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.768337 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-v5s77" Apr 23 13:32:09.778184 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.778161 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:09.785027 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.785006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-shltj" Apr 23 13:32:09.792599 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.792578 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" Apr 23 13:32:09.800170 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.800147 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" Apr 23 13:32:09.806692 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.806670 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mbln5" Apr 23 13:32:09.813161 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.813141 2577 util.go:30] "No sandbox for pod can be found. 
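The deferral printed in the mount failure above (durationBeforeRetry 500ms) doubles on each subsequent failure: later entries for the same volumes show 1s, 2s, 4s, and then 8s. A minimal sketch of that doubling schedule follows; it is illustrative only, not the kubelet's actual implementation, and the cap value is an assumption rather than something taken from this log.

    from datetime import timedelta

    # Reproduce the durationBeforeRetry ladder seen in this log:
    # 500ms -> 1s -> 2s -> 4s -> 8s. The two-minute cap is assumed,
    # not observed in these entries.
    def backoff_delays(initial=timedelta(milliseconds=500), cap=timedelta(minutes=2)):
        delay = initial
        while True:
            yield min(delay, cap)
            delay *= 2

    delays = backoff_delays()
    print([str(next(delays)) for _ in range(5)])
    # ['0:00:00.500000', '0:00:01', '0:00:02', '0:00:04', '0:00:08']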
Apr 23 13:32:09.819678 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:09.819661 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wxlj2"
Apr 23 13:32:10.174046 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.173948 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w"
Apr 23 13:32:10.174212 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:10.174138 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:10.174273 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:10.174234 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs podName:821df7f9-3f87-4f86-a7e9-82cad302fff0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:11.174213104 +0000 UTC m=+4.152121679 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs") pod "network-metrics-daemon-6pz6w" (UID: "821df7f9-3f87-4f86-a7e9-82cad302fff0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:10.245862 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:10.245836 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2241e05b_7796_4e2e_b1cf_f47baaeef969.slice/crio-43b23323fa9621c7f46e1e15a1017581e703ddc44398d3fca39448226182f248 WatchSource:0}: Error finding container 43b23323fa9621c7f46e1e15a1017581e703ddc44398d3fca39448226182f248: Status 404 returned error can't find the container with id 43b23323fa9621c7f46e1e15a1017581e703ddc44398d3fca39448226182f248
Apr 23 13:32:10.249518 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:10.249473 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a590caf_dc65_421e_a4c8_40d3258ddd7b.slice/crio-c2bf47ea117e8779ccd170157f6736658f5d31c014cd4fa9b55c6bae37f60063 WatchSource:0}: Error finding container c2bf47ea117e8779ccd170157f6736658f5d31c014cd4fa9b55c6bae37f60063: Status 404 returned error can't find the container with id c2bf47ea117e8779ccd170157f6736658f5d31c014cd4fa9b55c6bae37f60063
Apr 23 13:32:10.251171 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:10.251146 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58abee5a_98ee_4a90_ab84_a17d06d08d00.slice/crio-f7a4d136541a249ed3763abad6044a9cf5fb384b9575dea9d303344bba07f519 WatchSource:0}: Error finding container f7a4d136541a249ed3763abad6044a9cf5fb384b9575dea9d303344bba07f519: Status 404 returned error can't find the container with id f7a4d136541a249ed3763abad6044a9cf5fb384b9575dea9d303344bba07f519
Apr 23 13:32:10.252372 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:10.252335 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8de6e215_aa9d_4003_aac4_d2a7bbdb59fb.slice/crio-7df92c489e9c41241276f4afa8e1948261ae69344ebd37cfc2f60478839cd379 WatchSource:0}: Error finding container 7df92c489e9c41241276f4afa8e1948261ae69344ebd37cfc2f60478839cd379: Status 404 returned error can't find the container with id 7df92c489e9c41241276f4afa8e1948261ae69344ebd37cfc2f60478839cd379
Apr 23 13:32:10.253089 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:10.253065 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaecfee63_4703_49e8_81cc_aa07bc06ce4e.slice/crio-91698f3083a7b155fb97b88aff5279895e248d053cce5417230225fe18e445e3 WatchSource:0}: Error finding container 91698f3083a7b155fb97b88aff5279895e248d053cce5417230225fe18e445e3: Status 404 returned error can't find the container with id 91698f3083a7b155fb97b88aff5279895e248d053cce5417230225fe18e445e3
Apr 23 13:32:10.254705 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:10.254675 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c29fef9_0671_485c_988d_0b06e4091d1a.slice/crio-825315a04f19f16439e6db9bc401084ae510b0c6f87fba7c930c1da3f70678d4 WatchSource:0}: Error finding container 825315a04f19f16439e6db9bc401084ae510b0c6f87fba7c930c1da3f70678d4: Status 404 returned error can't find the container with id 825315a04f19f16439e6db9bc401084ae510b0c6f87fba7c930c1da3f70678d4
Apr 23 13:32:10.255691 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:10.255661 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab5642a_1989_41c1_956f_98f92fcc6f23.slice/crio-76ddfd2f312ec03ce5f0bdbc16e061fc29d525184c7093db4a0204156aadce0c WatchSource:0}: Error finding container 76ddfd2f312ec03ce5f0bdbc16e061fc29d525184c7093db4a0204156aadce0c: Status 404 returned error can't find the container with id 76ddfd2f312ec03ce5f0bdbc16e061fc29d525184c7093db4a0204156aadce0c
Apr 23 13:32:10.256556 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:10.256454 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb61ffc5b_bae6_4b74_a181_3e3df6606045.slice/crio-0cdf437a2722f2649ffda07f16ad23908fa754e656041f5aa1e5b309d3a941f5 WatchSource:0}: Error finding container 0cdf437a2722f2649ffda07f16ad23908fa754e656041f5aa1e5b309d3a941f5: Status 404 returned error can't find the container with id 0cdf437a2722f2649ffda07f16ad23908fa754e656041f5aa1e5b309d3a941f5
Apr 23 13:32:10.257638 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:10.257606 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fc2bb02_44f9_45bf_aa27_62f98f04e5ac.slice/crio-f3041c05bdf6c9a7fdce784be51f7935b4000c74191bef559e52db8579dec1de WatchSource:0}: Error finding container f3041c05bdf6c9a7fdce784be51f7935b4000c74191bef559e52db8579dec1de: Status 404 returned error can't find the container with id f3041c05bdf6c9a7fdce784be51f7935b4000c74191bef559e52db8579dec1de
Apr 23 13:32:10.274989 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.274970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjg76\" (UniqueName: \"kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76\") pod \"network-check-target-fl2td\" (UID: \"373af144-ae77-4496-8057-d855373807e4\") " pod="openshift-network-diagnostics/network-check-target-fl2td"
Apr 23 13:32:10.275105 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:10.275093 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:10.275148 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:10.275108 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:10.275148 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:10.275117 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gjg76 for pod openshift-network-diagnostics/network-check-target-fl2td: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:10.275224 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:10.275154 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76 podName:373af144-ae77-4496-8057-d855373807e4 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:11.275141936 +0000 UTC m=+4.253050505 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gjg76" (UniqueName: "kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76") pod "network-check-target-fl2td" (UID: "373af144-ae77-4496-8057-d855373807e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:10.500006 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.499971 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:27:08 +0000 UTC" deadline="2027-09-27 09:27:29.547649684 +0000 UTC"
Apr 23 13:32:10.500006 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.500001 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12523h55m19.047650794s"
Apr 23 13:32:10.579685 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.579652 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w"
Apr 23 13:32:10.579830 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:10.579802 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0"
pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:10.586958 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.586916 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-shltj" event={"ID":"2241e05b-7796-4e2e-b1cf-f47baaeef969","Type":"ContainerStarted","Data":"43b23323fa9621c7f46e1e15a1017581e703ddc44398d3fca39448226182f248"} Apr 23 13:32:10.589021 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.588980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal" event={"ID":"bbbaef9da88b934e809d0d29bccb4dd7","Type":"ContainerStarted","Data":"4503235ccd2865528255fcb2e37f7884b3e56853dbb9244c0f7085cb9ec3c3d1"} Apr 23 13:32:10.590413 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.590383 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" event={"ID":"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac","Type":"ContainerStarted","Data":"f3041c05bdf6c9a7fdce784be51f7935b4000c74191bef559e52db8579dec1de"} Apr 23 13:32:10.591887 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.591855 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xvg2d" event={"ID":"b61ffc5b-bae6-4b74-a181-3e3df6606045","Type":"ContainerStarted","Data":"0cdf437a2722f2649ffda07f16ad23908fa754e656041f5aa1e5b309d3a941f5"} Apr 23 13:32:10.592887 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.592864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wxlj2" event={"ID":"5ab5642a-1989-41c1-956f-98f92fcc6f23","Type":"ContainerStarted","Data":"76ddfd2f312ec03ce5f0bdbc16e061fc29d525184c7093db4a0204156aadce0c"} Apr 23 13:32:10.594120 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.594091 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" event={"ID":"7c29fef9-0671-485c-988d-0b06e4091d1a","Type":"ContainerStarted","Data":"825315a04f19f16439e6db9bc401084ae510b0c6f87fba7c930c1da3f70678d4"} Apr 23 13:32:10.595207 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.595182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v5s77" event={"ID":"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb","Type":"ContainerStarted","Data":"7df92c489e9c41241276f4afa8e1948261ae69344ebd37cfc2f60478839cd379"} Apr 23 13:32:10.596362 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.596315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mbln5" event={"ID":"aecfee63-4703-49e8-81cc-aa07bc06ce4e","Type":"ContainerStarted","Data":"91698f3083a7b155fb97b88aff5279895e248d053cce5417230225fe18e445e3"} Apr 23 13:32:10.599204 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.599176 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" event={"ID":"58abee5a-98ee-4a90-ab84-a17d06d08d00","Type":"ContainerStarted","Data":"f7a4d136541a249ed3763abad6044a9cf5fb384b9575dea9d303344bba07f519"} Apr 23 13:32:10.600521 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:10.600481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" event={"ID":"4a590caf-dc65-421e-a4c8-40d3258ddd7b","Type":"ContainerStarted","Data":"c2bf47ea117e8779ccd170157f6736658f5d31c014cd4fa9b55c6bae37f60063"} Apr 23 13:32:10.602331 ip-10-0-132-207 
kubenswrapper[2577]: I0423 13:32:10.602277 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-207.ec2.internal" podStartSLOduration=2.602262975 podStartE2EDuration="2.602262975s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:10.601508639 +0000 UTC m=+3.579417232" watchObservedRunningTime="2026-04-23 13:32:10.602262975 +0000 UTC m=+3.580171569" Apr 23 13:32:11.181110 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:11.181074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:11.181256 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:11.181243 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:11.181311 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:11.181306 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs podName:821df7f9-3f87-4f86-a7e9-82cad302fff0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:13.181286756 +0000 UTC m=+6.159195333 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs") pod "network-metrics-daemon-6pz6w" (UID: "821df7f9-3f87-4f86-a7e9-82cad302fff0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:11.282456 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:11.281964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjg76\" (UniqueName: \"kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76\") pod \"network-check-target-fl2td\" (UID: \"373af144-ae77-4496-8057-d855373807e4\") " pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:11.282456 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:11.282114 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:11.282456 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:11.282127 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:11.282456 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:11.282137 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gjg76 for pod openshift-network-diagnostics/network-check-target-fl2td: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:11.282456 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:11.282176 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76 podName:373af144-ae77-4496-8057-d855373807e4 nodeName:}" failed. 
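For the kube-apiserver-proxy pod above, no image pull was recorded (both pulling timestamps are the zero value), so podStartSLOduration reduces to the observed running time minus the pod creation timestamp; in this entry observedRunningTime and watchObservedRunningTime happen to agree to the microsecond. Checking the arithmetic with the values copied from that entry:

    from datetime import datetime, timezone

    # Timestamps copied from the pod_startup_latency_tracker.go:104 entry above.
    created = datetime(2026, 4, 23, 13, 32, 8, tzinfo=timezone.utc)
    running = datetime(2026, 4, 23, 13, 32, 10, 602263, tzinfo=timezone.utc)

    print((running - created).total_seconds())
    # 2.602263, i.e. podStartSLOduration=2.602262975 to microsecond precision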
No retries permitted until 2026-04-23 13:32:13.282164244 +0000 UTC m=+6.260072814 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gjg76" (UniqueName: "kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76") pod "network-check-target-fl2td" (UID: "373af144-ae77-4496-8057-d855373807e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:11.583920 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:11.583827 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:11.584482 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:11.583947 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:11.611062 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:11.611030 2577 generic.go:358] "Generic (PLEG): container finished" podID="e0714ba56e53b69dc64c17ef7d8f7924" containerID="fca6cd2e1cbca6b6475a74371445fa7ec81bf506d83567f69b34ea218eecfd10" exitCode=0 Apr 23 13:32:11.612036 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:11.612009 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal" event={"ID":"e0714ba56e53b69dc64c17ef7d8f7924","Type":"ContainerDied","Data":"fca6cd2e1cbca6b6475a74371445fa7ec81bf506d83567f69b34ea218eecfd10"} Apr 23 13:32:12.579931 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:12.579890 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:12.580110 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:12.580031 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:12.624122 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:12.624066 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal" event={"ID":"e0714ba56e53b69dc64c17ef7d8f7924","Type":"ContainerStarted","Data":"269c5961ef8adb0866ec0c030d50bc0f80763276f87a466a4475ff93e0788bab"} Apr 23 13:32:13.196394 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.195838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:13.196394 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:13.195974 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:13.196394 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:13.196033 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs podName:821df7f9-3f87-4f86-a7e9-82cad302fff0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:17.196015429 +0000 UTC m=+10.173924008 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs") pod "network-metrics-daemon-6pz6w" (UID: "821df7f9-3f87-4f86-a7e9-82cad302fff0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:13.296484 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.296212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjg76\" (UniqueName: \"kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76\") pod \"network-check-target-fl2td\" (UID: \"373af144-ae77-4496-8057-d855373807e4\") " pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:13.296484 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:13.296389 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:13.296484 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:13.296410 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:13.296484 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:13.296422 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gjg76 for pod openshift-network-diagnostics/network-check-target-fl2td: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:13.296484 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:13.296481 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76 podName:373af144-ae77-4496-8057-d855373807e4 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:17.296461807 +0000 UTC m=+10.274370400 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gjg76" (UniqueName: "kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76") pod "network-check-target-fl2td" (UID: "373af144-ae77-4496-8057-d855373807e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:13.362859 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.361871 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-207.ec2.internal" podStartSLOduration=5.361851293 podStartE2EDuration="5.361851293s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:12.639332865 +0000 UTC m=+5.617241461" watchObservedRunningTime="2026-04-23 13:32:13.361851293 +0000 UTC m=+6.339759886" Apr 23 13:32:13.362859 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.362104 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-x4nd8"] Apr 23 13:32:13.365206 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.365181 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:13.365332 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:13.365266 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:13.396782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.396754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75053ed6-040a-450c-b423-ce9ec4714d2f-dbus\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:13.396911 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.396799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75053ed6-040a-450c-b423-ce9ec4714d2f-kubelet-config\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:13.396911 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.396854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:13.498026 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.497940 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75053ed6-040a-450c-b423-ce9ec4714d2f-kubelet-config\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:13.498026 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.498015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:13.498237 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.498087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75053ed6-040a-450c-b423-ce9ec4714d2f-dbus\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:13.498295 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.498237 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75053ed6-040a-450c-b423-ce9ec4714d2f-dbus\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:13.498347 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.498297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75053ed6-040a-450c-b423-ce9ec4714d2f-kubelet-config\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:13.498402 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:13.498386 2577 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:13.498455 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:13.498440 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret podName:75053ed6-040a-450c-b423-ce9ec4714d2f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:13.998422384 +0000 UTC m=+6.976330954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret") pod "global-pull-secret-syncer-x4nd8" (UID: "75053ed6-040a-450c-b423-ce9ec4714d2f") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:13.580420 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:13.580212 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:13.580420 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:13.580337 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:14.002678 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:14.002636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:14.003150 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:14.002808 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:14.003150 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:14.002868 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret podName:75053ed6-040a-450c-b423-ce9ec4714d2f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:15.00285058 +0000 UTC m=+7.980759151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret") pod "global-pull-secret-syncer-x4nd8" (UID: "75053ed6-040a-450c-b423-ce9ec4714d2f") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:14.579999 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:14.579964 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:14.580170 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:14.579976 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:14.580170 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:14.580097 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:14.580291 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:14.580209 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:15.010866 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:15.010813 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:15.011274 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:15.010978 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:15.011274 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:15.011035 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret podName:75053ed6-040a-450c-b423-ce9ec4714d2f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:17.011017632 +0000 UTC m=+9.988926203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret") pod "global-pull-secret-syncer-x4nd8" (UID: "75053ed6-040a-450c-b423-ce9ec4714d2f") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:15.579919 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:15.579887 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:15.580095 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:15.580008 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:16.580125 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:16.580089 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:16.580591 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:16.580225 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:16.580591 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:16.580090 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:16.580591 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:16.580392 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:17.028758 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:17.028723 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:17.028953 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:17.028881 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:17.028953 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:17.028948 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret podName:75053ed6-040a-450c-b423-ce9ec4714d2f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:21.028929421 +0000 UTC m=+14.006838004 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret") pod "global-pull-secret-syncer-x4nd8" (UID: "75053ed6-040a-450c-b423-ce9ec4714d2f") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:17.229911 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:17.229874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:17.230097 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:17.230046 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:17.230157 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:17.230117 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs podName:821df7f9-3f87-4f86-a7e9-82cad302fff0 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:25.230094343 +0000 UTC m=+18.208002917 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs") pod "network-metrics-daemon-6pz6w" (UID: "821df7f9-3f87-4f86-a7e9-82cad302fff0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:17.330927 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:17.330843 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjg76\" (UniqueName: \"kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76\") pod \"network-check-target-fl2td\" (UID: \"373af144-ae77-4496-8057-d855373807e4\") " pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:17.331386 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:17.331362 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:17.331509 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:17.331391 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:17.331509 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:17.331405 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gjg76 for pod openshift-network-diagnostics/network-check-target-fl2td: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:17.332514 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:17.331788 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76 podName:373af144-ae77-4496-8057-d855373807e4 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:25.331759347 +0000 UTC m=+18.309667921 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gjg76" (UniqueName: "kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76") pod "network-check-target-fl2td" (UID: "373af144-ae77-4496-8057-d855373807e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:17.580210 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:17.580160 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:17.580645 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:17.580294 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:18.580144 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:18.580115 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:18.580144 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:18.580132 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:18.580690 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:18.580232 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:18.580690 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:18.580343 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:19.579766 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:19.579691 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:19.579912 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:19.579797 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:20.579589 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:20.579556 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:20.580046 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:20.579557 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:20.580046 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:20.579685 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:20.580046 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:20.579751 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:21.056573 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:21.056529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:21.056755 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:21.056674 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:21.056755 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:21.056742 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret podName:75053ed6-040a-450c-b423-ce9ec4714d2f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:29.056724035 +0000 UTC m=+22.034632622 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret") pod "global-pull-secret-syncer-x4nd8" (UID: "75053ed6-040a-450c-b423-ce9ec4714d2f") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:21.579522 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:21.579476 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:21.579690 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:21.579613 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:22.579540 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:22.579508 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:22.579700 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:22.579508 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:22.579700 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:22.579643 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:22.579700 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:22.579678 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:23.579995 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:23.579963 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:23.580440 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:23.580078 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:24.579660 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:24.579481 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:24.579911 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:24.579481 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:24.579911 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:24.579763 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:24.579911 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:24.579850 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:25.293535 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:25.293473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:25.293714 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:25.293655 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:25.293768 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:25.293735 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs podName:821df7f9-3f87-4f86-a7e9-82cad302fff0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:41.293714097 +0000 UTC m=+34.271622668 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs") pod "network-metrics-daemon-6pz6w" (UID: "821df7f9-3f87-4f86-a7e9-82cad302fff0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:25.394760 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:25.394727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjg76\" (UniqueName: \"kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76\") pod \"network-check-target-fl2td\" (UID: \"373af144-ae77-4496-8057-d855373807e4\") " pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:25.394903 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:25.394865 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:25.394903 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:25.394882 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:25.394903 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:25.394891 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gjg76 for pod openshift-network-diagnostics/network-check-target-fl2td: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:25.395025 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:25.394939 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76 podName:373af144-ae77-4496-8057-d855373807e4 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:41.39492636 +0000 UTC m=+34.372834930 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gjg76" (UniqueName: "kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76") pod "network-check-target-fl2td" (UID: "373af144-ae77-4496-8057-d855373807e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:25.580220 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:25.580151 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:25.580624 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:25.580247 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:26.579436 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:26.579406 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:26.579638 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:26.579413 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:26.579638 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:26.579535 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:26.579638 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:26.579589 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:27.580635 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:27.580607 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:27.581261 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:27.580715 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:27.649067 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:27.649033 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xvg2d" event={"ID":"b61ffc5b-bae6-4b74-a181-3e3df6606045","Type":"ContainerStarted","Data":"fc149d003b2d4c91549aab445d4b38719c0db63a67fe29598a5cd676b92c310e"} Apr 23 13:32:27.650560 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:27.650238 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" event={"ID":"7c29fef9-0671-485c-988d-0b06e4091d1a","Type":"ContainerStarted","Data":"90e3fbc41be2e695b1136410e9d933f75126efdcdabe05de1cfc750f7e5ede38"} Apr 23 13:32:27.651890 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:27.651856 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mbln5" event={"ID":"aecfee63-4703-49e8-81cc-aa07bc06ce4e","Type":"ContainerStarted","Data":"c6ea73970bb8e608855b59fe8f0cd8065dd879c96c7bc3467f807997f989824b"} Apr 23 13:32:27.653620 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:27.653050 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" event={"ID":"4a590caf-dc65-421e-a4c8-40d3258ddd7b","Type":"ContainerStarted","Data":"9fbe5807028bc96716773cb08370a0bf7063b17ed8151bee5590d70accb84a81"} Apr 23 13:32:27.656112 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:27.656079 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-shltj" event={"ID":"2241e05b-7796-4e2e-b1cf-f47baaeef969","Type":"ContainerStarted","Data":"6a027cc341f82df1d5ca4b5b975c1422b672ea62d25399577320198b36bfafcd"} Apr 23 13:32:27.668806 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:27.668766 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-g4zgx" podStartSLOduration=3.5571672420000002 podStartE2EDuration="20.668739558s" podCreationTimestamp="2026-04-23 13:32:07 +0000 UTC" firstStartedPulling="2026-04-23 13:32:10.256756912 +0000 UTC m=+3.234665488" lastFinishedPulling="2026-04-23 13:32:27.368329228 +0000 UTC m=+20.346237804" observedRunningTime="2026-04-23 13:32:27.668071854 +0000 UTC m=+20.645980448" watchObservedRunningTime="2026-04-23 13:32:27.668739558 +0000 UTC m=+20.646648150" Apr 23 13:32:27.724848 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:27.724806 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-shltj" podStartSLOduration=3.606812262 podStartE2EDuration="20.72478542s" podCreationTimestamp="2026-04-23 13:32:07 +0000 UTC" firstStartedPulling="2026-04-23 13:32:10.248558292 +0000 UTC m=+3.226466865" lastFinishedPulling="2026-04-23 13:32:27.366531453 +0000 UTC m=+20.344440023" observedRunningTime="2026-04-23 13:32:27.724716412 +0000 UTC m=+20.702625004" watchObservedRunningTime="2026-04-23 13:32:27.72478542 +0000 UTC m=+20.702694068" Apr 23 13:32:27.748029 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:27.747987 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mbln5" podStartSLOduration=8.361977091 podStartE2EDuration="20.747973934s" podCreationTimestamp="2026-04-23 13:32:07 +0000 UTC" firstStartedPulling="2026-04-23 13:32:10.255296818 +0000 UTC m=+3.233205389" lastFinishedPulling="2026-04-23 13:32:22.641293645 +0000 UTC m=+15.619202232" observedRunningTime="2026-04-23 13:32:27.747619853 +0000 UTC m=+20.725528449" watchObservedRunningTime="2026-04-23 13:32:27.747973934 +0000 UTC m=+20.725882580" Apr 23 13:32:28.580089 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.579923 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:28.580196 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.579923 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:28.580196 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:28.580163 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:28.580299 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:28.580253 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:28.659031 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.658993 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wxlj2" event={"ID":"5ab5642a-1989-41c1-956f-98f92fcc6f23","Type":"ContainerStarted","Data":"e9a03a3bc2f67a52cd11653d9cee63924265f4af2d3329eeeb1eb1dbf54cb95e"} Apr 23 13:32:28.663342 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.663314 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" event={"ID":"58abee5a-98ee-4a90-ab84-a17d06d08d00","Type":"ContainerStarted","Data":"2ff04359b70b88f7afcb7a5af02799c1422a9bcdfc289ce310b2bf805af53021"} Apr 23 13:32:28.663342 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.663345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" event={"ID":"58abee5a-98ee-4a90-ab84-a17d06d08d00","Type":"ContainerStarted","Data":"b756c9797d327a1d82b577ba82158f2352894f46830d513e6007a60b44ee58e8"} Apr 23 13:32:28.663624 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.663359 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" event={"ID":"58abee5a-98ee-4a90-ab84-a17d06d08d00","Type":"ContainerStarted","Data":"e07e3ecc01a78a04b13c78006ff338f6524ba3b87f276048c9935fb94d38b0dd"} Apr 23 13:32:28.663624 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.663372 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" event={"ID":"58abee5a-98ee-4a90-ab84-a17d06d08d00","Type":"ContainerStarted","Data":"ace49ed6f607a4094c0121d06baad011225e099bbce86b26aa2ca96215875adb"} Apr 23 13:32:28.663624 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.663384 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" event={"ID":"58abee5a-98ee-4a90-ab84-a17d06d08d00","Type":"ContainerStarted","Data":"b6d69ee8415b11c034e4742874a29ca88ba269a36c8506628a31d8003464d5d4"} Apr 23 13:32:28.663624 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.663397 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" event={"ID":"58abee5a-98ee-4a90-ab84-a17d06d08d00","Type":"ContainerStarted","Data":"0ecf2e7fed7b14e540fb8dfd110f6dccb79dbe8a1a56dc5a905132d56cf1e215"} Apr 23 13:32:28.665520 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.665467 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a590caf-dc65-421e-a4c8-40d3258ddd7b" containerID="9fbe5807028bc96716773cb08370a0bf7063b17ed8151bee5590d70accb84a81" exitCode=0 Apr 23 13:32:28.665616 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.665567 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" event={"ID":"4a590caf-dc65-421e-a4c8-40d3258ddd7b","Type":"ContainerDied","Data":"9fbe5807028bc96716773cb08370a0bf7063b17ed8151bee5590d70accb84a81"} Apr 23 13:32:28.669482 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.669455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" event={"ID":"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac","Type":"ContainerStarted","Data":"7052d780d3a9fddcca509252c111eaeb762322d0bc17717d9f06a491cd89be6e"} Apr 23 13:32:28.672856 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.672820 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wxlj2" podStartSLOduration=3.564376794 podStartE2EDuration="20.672809709s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="2026-04-23 13:32:10.258334985 +0000 UTC m=+3.236243555" lastFinishedPulling="2026-04-23 13:32:27.366767889 +0000 UTC m=+20.344676470" observedRunningTime="2026-04-23 13:32:28.672728603 +0000 UTC m=+21.650637194" watchObservedRunningTime="2026-04-23 13:32:28.672809709 +0000 UTC m=+21.650718301" Apr 23 13:32:28.704410 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.704371 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xvg2d" podStartSLOduration=4.414545424 podStartE2EDuration="21.704360096s" podCreationTimestamp="2026-04-23 13:32:07 +0000 UTC" firstStartedPulling="2026-04-23 13:32:10.258504225 +0000 UTC m=+3.236412795" lastFinishedPulling="2026-04-23 13:32:27.548318893 +0000 UTC m=+20.526227467" observedRunningTime="2026-04-23 13:32:28.686890419 +0000 UTC m=+21.664799012" watchObservedRunningTime="2026-04-23 13:32:28.704360096 +0000 UTC m=+21.682268687" Apr 23 13:32:28.784683 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.784520 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 13:32:28.857195 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.857131 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-shltj" Apr 23 13:32:28.858129 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:28.858113 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-shltj" Apr 23 13:32:29.128253 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:29.128177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:29.128379 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:29.128312 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:29.128379 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:29.128363 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret podName:75053ed6-040a-450c-b423-ce9ec4714d2f nodeName:}" failed. No retries permitted until 2026-04-23 13:32:45.128350273 +0000 UTC m=+38.106258843 (durationBeforeRetry 16s). 
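The pod_startup_latency_tracker entries above are internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). For tuned-g4zgx: 20.668739558s - (13:32:27.368329228 - 13:32:10.256756912) = 3.557167242s, exactly the logged SLO duration. A small check of that arithmetic, with the times copied from the tuned-g4zgx entry:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse(time.RFC3339Nano, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	// Timestamps copied from the tuned-g4zgx latency entry above.
	created := parse("2026-04-23T13:32:07Z")
	firstPull := parse("2026-04-23T13:32:10.256756912Z")
	lastPull := parse("2026-04-23T13:32:27.368329228Z")
	running := parse("2026-04-23T13:32:27.668739558Z")

	e2e := running.Sub(created)          // 20.668739558s, the logged E2E duration
	slo := e2e - lastPull.Sub(firstPull) // 3.557167242s, the logged SLO duration
	fmt.Println(e2e, slo)
}
```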
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret") pod "global-pull-secret-syncer-x4nd8" (UID: "75053ed6-040a-450c-b423-ce9ec4714d2f") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:29.527287 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:29.527138 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:32:28.784680047Z","UUID":"5b2100d4-cbfd-4c7b-b875-83378142e2f7","Handler":null,"Name":"","Endpoint":""} Apr 23 13:32:29.529781 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:29.529750 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 13:32:29.529899 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:29.529791 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 13:32:29.579266 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:29.579239 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:29.579400 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:29.579359 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:29.673456 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:29.673414 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" event={"ID":"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac","Type":"ContainerStarted","Data":"af0689b12b89bd1c80fe799a71010cb6c70e4ea27beace61365f03b153b8d79e"} Apr 23 13:32:29.675582 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:29.675551 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v5s77" event={"ID":"8de6e215-aa9d-4003-aac4-d2a7bbdb59fb","Type":"ContainerStarted","Data":"73681549ec5b23a144260edec0ab77d0b7d347a94f3b1a6fd70ae11c27aee34c"} Apr 23 13:32:30.579830 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:30.579798 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:30.580035 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:30.579798 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:30.580035 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:30.579926 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:30.580035 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:30.579979 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:30.679353 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:30.679323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" event={"ID":"0fc2bb02-44f9-45bf-aa27-62f98f04e5ac","Type":"ContainerStarted","Data":"491c251be603e6d380062bcc0743adf6c67eaa5cf0753d169c7f986aba4fe6d7"} Apr 23 13:32:30.679879 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:30.679337 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 13:32:30.702763 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:30.702716 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-v5s77" podStartSLOduration=6.590083155 podStartE2EDuration="23.702699896s" podCreationTimestamp="2026-04-23 13:32:07 +0000 UTC" firstStartedPulling="2026-04-23 13:32:10.254136671 +0000 UTC m=+3.232045244" lastFinishedPulling="2026-04-23 13:32:27.366753409 +0000 UTC m=+20.344661985" observedRunningTime="2026-04-23 13:32:29.697549337 +0000 UTC m=+22.675457942" watchObservedRunningTime="2026-04-23 13:32:30.702699896 +0000 UTC m=+23.680608490" Apr 23 13:32:30.703112 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:30.703073 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9pf4r" podStartSLOduration=4.369310585 podStartE2EDuration="23.703066475s" podCreationTimestamp="2026-04-23 13:32:07 +0000 UTC" firstStartedPulling="2026-04-23 13:32:10.259764971 +0000 UTC m=+3.237673555" lastFinishedPulling="2026-04-23 13:32:29.593520872 +0000 UTC m=+22.571429445" observedRunningTime="2026-04-23 13:32:30.702373151 +0000 UTC m=+23.680281743" watchObservedRunningTime="2026-04-23 13:32:30.703066475 +0000 UTC m=+23.680975068" Apr 23 13:32:31.579967 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:31.579935 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:31.580149 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:31.580046 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:31.684102 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:31.684061 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" event={"ID":"58abee5a-98ee-4a90-ab84-a17d06d08d00","Type":"ContainerStarted","Data":"f0ce1931b379a5fd2d8a944e4d62364a22279d52c14785a9720ea6b7ffc1cd3d"} Apr 23 13:32:32.579833 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:32.579803 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:32.579992 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:32.579805 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:32.579992 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:32.579924 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:32.579992 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:32.579982 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:33.579915 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:33.579742 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:33.580330 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:33.579985 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:33.690272 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:33.690236 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" event={"ID":"58abee5a-98ee-4a90-ab84-a17d06d08d00","Type":"ContainerStarted","Data":"9c7527bea5a9c4d8c6ace1795e7a05d75f5b0f9be60ba1781bf70585d3b63f0d"} Apr 23 13:32:33.690553 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:33.690531 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:33.690668 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:33.690561 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:33.691694 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:33.691664 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a590caf-dc65-421e-a4c8-40d3258ddd7b" containerID="d9e1f7572c9cc5bb2c743d9441ae7cd9b917e31e7c5fd0398936974981706219" exitCode=0 Apr 23 13:32:33.691857 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:33.691700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" event={"ID":"4a590caf-dc65-421e-a4c8-40d3258ddd7b","Type":"ContainerDied","Data":"d9e1f7572c9cc5bb2c743d9441ae7cd9b917e31e7c5fd0398936974981706219"} Apr 23 13:32:33.705330 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:33.705307 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:33.705426 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:33.705368 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:33.724257 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:33.722742 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" podStartSLOduration=9.034546647 podStartE2EDuration="26.722726418s" podCreationTimestamp="2026-04-23 13:32:07 +0000 UTC" firstStartedPulling="2026-04-23 13:32:10.252935486 +0000 UTC m=+3.230844071" lastFinishedPulling="2026-04-23 13:32:27.941115268 +0000 UTC m=+20.919023842" observedRunningTime="2026-04-23 13:32:33.720179206 +0000 UTC m=+26.698087808" watchObservedRunningTime="2026-04-23 13:32:33.722726418 +0000 UTC m=+26.700635012" Apr 23 13:32:34.579403 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.579336 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:34.579537 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.579336 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:34.579537 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:34.579444 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:34.579608 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:34.579531 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:34.696036 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.696005 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a590caf-dc65-421e-a4c8-40d3258ddd7b" containerID="408c1142c5e80fed6a0eedd371bcb7cb2c481d6b2d67405403a97369973197f1" exitCode=0 Apr 23 13:32:34.696427 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.696092 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" event={"ID":"4a590caf-dc65-421e-a4c8-40d3258ddd7b","Type":"ContainerDied","Data":"408c1142c5e80fed6a0eedd371bcb7cb2c481d6b2d67405403a97369973197f1"} Apr 23 13:32:34.696427 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.696318 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 13:32:34.933803 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.933776 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x4nd8"] Apr 23 13:32:34.933939 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.933869 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:34.933986 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:34.933946 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:34.937150 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.937124 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6pz6w"] Apr 23 13:32:34.937254 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.937219 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:34.937346 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:34.937326 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:34.937759 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.937740 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fl2td"] Apr 23 13:32:34.937862 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:34.937819 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:34.937911 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:34.937896 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:35.209613 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:35.209543 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc" Apr 23 13:32:35.699428 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:35.699399 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a590caf-dc65-421e-a4c8-40d3258ddd7b" containerID="66d215d7caa9d5e607098872a7a2c329df85e856c283f1b9ccc2e4287052ca01" exitCode=0 Apr 23 13:32:35.699806 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:35.699483 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" event={"ID":"4a590caf-dc65-421e-a4c8-40d3258ddd7b","Type":"ContainerDied","Data":"66d215d7caa9d5e607098872a7a2c329df85e856c283f1b9ccc2e4287052ca01"} Apr 23 13:32:36.580112 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:36.580073 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:36.580264 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:36.580124 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:36.580264 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:36.580125 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:36.580264 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:36.580224 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:36.580378 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:36.580343 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:36.580425 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:36.580413 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:36.643831 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:36.643802 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-shltj" Apr 23 13:32:36.644140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:36.643940 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 13:32:36.644543 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:36.644507 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-shltj" Apr 23 13:32:38.579206 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:38.579172 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:38.579931 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:38.579172 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:38.579931 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:38.579300 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:38.579931 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:38.579172 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:38.579931 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:38.579361 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:38.579931 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:38.579443 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:40.579381 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:40.579346 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:40.580023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:40.579346 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:40.580023 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:40.579470 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0" Apr 23 13:32:40.580023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:40.579346 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:40.580023 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:40.579556 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4nd8" podUID="75053ed6-040a-450c-b423-ce9ec4714d2f" Apr 23 13:32:40.580023 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:40.579633 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fl2td" podUID="373af144-ae77-4496-8057-d855373807e4" Apr 23 13:32:41.321252 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.321185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:41.321437 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.321293 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:41.321437 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.321341 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs podName:821df7f9-3f87-4f86-a7e9-82cad302fff0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:13.321327691 +0000 UTC m=+66.299236260 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs") pod "network-metrics-daemon-6pz6w" (UID: "821df7f9-3f87-4f86-a7e9-82cad302fff0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:41.368748 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.368722 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-207.ec2.internal" event="NodeReady" Apr 23 13:32:41.368887 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.368849 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:32:41.404188 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.404118 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-986fd6f74-qp86z"] Apr 23 13:32:41.420913 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.420717 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rrmk8"] Apr 23 13:32:41.421149 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.421125 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.422015 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.421980 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjg76\" (UniqueName: \"kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76\") pod \"network-check-target-fl2td\" (UID: \"373af144-ae77-4496-8057-d855373807e4\") " pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:41.422191 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.422172 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:41.422254 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.422197 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:41.422254 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.422212 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gjg76 for pod openshift-network-diagnostics/network-check-target-fl2td: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:41.422361 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.422265 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76 podName:373af144-ae77-4496-8057-d855373807e4 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:13.422253556 +0000 UTC m=+66.400162126 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-gjg76" (UniqueName: "kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76") pod "network-check-target-fl2td" (UID: "373af144-ae77-4496-8057-d855373807e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:41.423890 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.423872 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 13:32:41.424370 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.424354 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7pd9g\"" Apr 23 13:32:41.424647 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.424628 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 13:32:41.424745 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.424633 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 13:32:41.435978 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.435956 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b9l6f"] Apr 23 13:32:41.436309 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.436155 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.438001 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.437973 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 13:32:41.438656 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.438641 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 13:32:41.439152 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.439135 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 13:32:41.439239 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.439170 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svh2p\"" Apr 23 13:32:41.450792 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.450774 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-986fd6f74-qp86z"] Apr 23 13:32:41.450792 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.450793 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rrmk8"] Apr 23 13:32:41.450906 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.450802 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b9l6f"] Apr 23 13:32:41.450906 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.450871 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:32:41.453787 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.453768 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 13:32:41.454006 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.453801 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cqcw8\"" Apr 23 13:32:41.454006 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.453850 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 13:32:41.454098 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.454083 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 13:32:41.522474 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-trusted-ca\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.522645 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522477 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-bound-sa-token\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.522645 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522512 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47jnp\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-kube-api-access-47jnp\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.522645 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3ae8909-ecd0-47e1-a99c-4ea293db3077-config-volume\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.522645 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522613 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.522645 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522637 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3ae8909-ecd0-47e1-a99c-4ea293db3077-tmp-dir\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.522903 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-image-registry-private-configuration\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.522903 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522743 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-ca-trust-extracted\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.522903 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522780 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.522903 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fdj8\" (UniqueName: \"kubernetes.io/projected/d3ae8909-ecd0-47e1-a99c-4ea293db3077-kube-api-access-8fdj8\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.522903 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-installation-pull-secrets\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.522903 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.522895 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-certificates\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.624156 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624135 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-image-registry-private-configuration\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624201 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-ca-trust-extracted\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fdj8\" (UniqueName: \"kubernetes.io/projected/d3ae8909-ecd0-47e1-a99c-4ea293db3077-kube-api-access-8fdj8\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-installation-pull-secrets\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.624353 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:41.624565 
ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.624371 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-986fd6f74-qp86z: secret "image-registry-tls" not found Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624353 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-certificates\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.624415 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls podName:2de7f8d7-9da6-4fa8-b5a9-c641f96806e0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:42.124396452 +0000 UTC m=+35.102305022 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls") pod "image-registry-986fd6f74-qp86z" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0") : secret "image-registry-tls" not found Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624455 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-trusted-ca\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-bound-sa-token\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624521 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47jnp\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-kube-api-access-47jnp\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624544 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-ca-trust-extracted\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.624565 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3ae8909-ecd0-47e1-a99c-4ea293db3077-config-volume\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.625154 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624587 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.625154 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3ae8909-ecd0-47e1-a99c-4ea293db3077-tmp-dir\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.625154 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624636 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fh47\" (UniqueName: \"kubernetes.io/projected/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-kube-api-access-6fh47\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:32:41.625154 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.624906 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:41.625154 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.624961 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls podName:d3ae8909-ecd0-47e1-a99c-4ea293db3077 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:42.124946272 +0000 UTC m=+35.102854850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls") pod "dns-default-rrmk8" (UID: "d3ae8909-ecd0-47e1-a99c-4ea293db3077") : secret "dns-default-metrics-tls" not found Apr 23 13:32:41.625154 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.624967 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-certificates\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.625154 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.625094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d3ae8909-ecd0-47e1-a99c-4ea293db3077-tmp-dir\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.625361 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.625224 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d3ae8909-ecd0-47e1-a99c-4ea293db3077-config-volume\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.625578 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.625562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-trusted-ca\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " 
pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.628183 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.628150 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-image-registry-private-configuration\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.628183 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.628177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-installation-pull-secrets\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.634575 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.634384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fdj8\" (UniqueName: \"kubernetes.io/projected/d3ae8909-ecd0-47e1-a99c-4ea293db3077-kube-api-access-8fdj8\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:41.634860 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.634834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-bound-sa-token\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.635185 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.635164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47jnp\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-kube-api-access-47jnp\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:41.725113 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.725091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fh47\" (UniqueName: \"kubernetes.io/projected/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-kube-api-access-6fh47\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:32:41.725225 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.725127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:32:41.725263 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.725225 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:41.725297 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:41.725279 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert podName:5d0b8118-d3c3-4333-a6a3-c53abf8e3daa nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:42.225263247 +0000 UTC m=+35.203171817 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert") pod "ingress-canary-b9l6f" (UID: "5d0b8118-d3c3-4333-a6a3-c53abf8e3daa") : secret "canary-serving-cert" not found Apr 23 13:32:41.743657 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.743629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fh47\" (UniqueName: \"kubernetes.io/projected/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-kube-api-access-6fh47\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:32:41.882026 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.881997 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk"] Apr 23 13:32:41.901836 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.901810 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk"] Apr 23 13:32:41.901937 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.901857 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" Apr 23 13:32:41.904693 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.904669 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 13:32:41.904812 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.904672 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 13:32:41.904812 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.904721 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-hv55t\"" Apr 23 13:32:41.904812 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.904680 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 13:32:41.904812 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.904672 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 13:32:41.906250 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.906228 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs"] Apr 23 13:32:41.918371 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.918346 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96"] Apr 23 13:32:41.918523 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.918507 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:41.921007 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.920989 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 13:32:41.932319 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.932298 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs"] Apr 23 13:32:41.932319 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.932323 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96"] Apr 23 13:32:41.932436 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.932394 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:41.935167 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.935150 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 13:32:41.935261 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.935185 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 13:32:41.935261 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.935183 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 13:32:41.935261 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:41.935254 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 13:32:42.027672 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.027635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.027797 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.027684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f701d829-2677-4e05-be5e-da1e52476ffb-tmp\") pod \"klusterlet-addon-workmgr-5c77fc8847-462fs\" (UID: \"f701d829-2677-4e05-be5e-da1e52476ffb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:42.027797 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.027745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-ca\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.027797 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.027779 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgsb\" (UniqueName: \"kubernetes.io/projected/f701d829-2677-4e05-be5e-da1e52476ffb-kube-api-access-vxgsb\") pod \"klusterlet-addon-workmgr-5c77fc8847-462fs\" (UID: \"f701d829-2677-4e05-be5e-da1e52476ffb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:42.027964 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.027807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/70f94fd0-7f09-4697-9cbe-0c891f7f9740-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.027964 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.027853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-hub\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.027964 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.027883 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kjbb\" (UniqueName: \"kubernetes.io/projected/cc46cf5b-f4d6-43f9-b959-fad786ee3667-kube-api-access-9kjbb\") pod \"managed-serviceaccount-addon-agent-68bb9b987f-g22tk\" (UID: \"cc46cf5b-f4d6-43f9-b959-fad786ee3667\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" Apr 23 13:32:42.027964 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.027922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.027964 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.027948 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b645t\" (UniqueName: \"kubernetes.io/projected/70f94fd0-7f09-4697-9cbe-0c891f7f9740-kube-api-access-b645t\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.028171 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.027989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cc46cf5b-f4d6-43f9-b959-fad786ee3667-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68bb9b987f-g22tk\" (UID: \"cc46cf5b-f4d6-43f9-b959-fad786ee3667\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" Apr 23 13:32:42.028171 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.028014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/f701d829-2677-4e05-be5e-da1e52476ffb-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c77fc8847-462fs\" (UID: \"f701d829-2677-4e05-be5e-da1e52476ffb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:42.129358 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.129280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.129358 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.129321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b645t\" (UniqueName: \"kubernetes.io/projected/70f94fd0-7f09-4697-9cbe-0c891f7f9740-kube-api-access-b645t\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.129626 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.129360 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cc46cf5b-f4d6-43f9-b959-fad786ee3667-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68bb9b987f-g22tk\" (UID: \"cc46cf5b-f4d6-43f9-b959-fad786ee3667\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" Apr 23 13:32:42.129626 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.129386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f701d829-2677-4e05-be5e-da1e52476ffb-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c77fc8847-462fs\" (UID: \"f701d829-2677-4e05-be5e-da1e52476ffb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:42.129819 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.129793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:42.129878 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.129864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.129974 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:42.129950 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:42.130060 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:42.130028 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls podName:d3ae8909-ecd0-47e1-a99c-4ea293db3077 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:43.130007348 +0000 UTC m=+36.107915918 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls") pod "dns-default-rrmk8" (UID: "d3ae8909-ecd0-47e1-a99c-4ea293db3077") : secret "dns-default-metrics-tls" not found Apr 23 13:32:42.130267 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.130237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f701d829-2677-4e05-be5e-da1e52476ffb-tmp\") pod \"klusterlet-addon-workmgr-5c77fc8847-462fs\" (UID: \"f701d829-2677-4e05-be5e-da1e52476ffb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:42.130685 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.130609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f701d829-2677-4e05-be5e-da1e52476ffb-tmp\") pod \"klusterlet-addon-workmgr-5c77fc8847-462fs\" (UID: \"f701d829-2677-4e05-be5e-da1e52476ffb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:42.130803 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.130671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:42.130803 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.130743 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-ca\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.130803 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.130783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgsb\" (UniqueName: \"kubernetes.io/projected/f701d829-2677-4e05-be5e-da1e52476ffb-kube-api-access-vxgsb\") pod \"klusterlet-addon-workmgr-5c77fc8847-462fs\" (UID: \"f701d829-2677-4e05-be5e-da1e52476ffb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:42.131023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.130819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/70f94fd0-7f09-4697-9cbe-0c891f7f9740-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.131023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.130851 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-hub\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.131023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.130875 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kjbb\" (UniqueName: \"kubernetes.io/projected/cc46cf5b-f4d6-43f9-b959-fad786ee3667-kube-api-access-9kjbb\") pod \"managed-serviceaccount-addon-agent-68bb9b987f-g22tk\" (UID: \"cc46cf5b-f4d6-43f9-b959-fad786ee3667\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" Apr 23 13:32:42.131239 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:42.130752 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:42.131239 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:42.131130 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-986fd6f74-qp86z: secret "image-registry-tls" not found Apr 23 13:32:42.131239 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:42.131191 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls podName:2de7f8d7-9da6-4fa8-b5a9-c641f96806e0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:43.131175468 +0000 UTC m=+36.109084040 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls") pod "image-registry-986fd6f74-qp86z" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0") : secret "image-registry-tls" not found Apr 23 13:32:42.132345 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.132324 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.132567 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.132549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.133262 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.133240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-ca\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.133692 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.133676 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/70f94fd0-7f09-4697-9cbe-0c891f7f9740-hub\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.139833 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.139809 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/70f94fd0-7f09-4697-9cbe-0c891f7f9740-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.139931 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.139903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f701d829-2677-4e05-be5e-da1e52476ffb-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c77fc8847-462fs\" (UID: \"f701d829-2677-4e05-be5e-da1e52476ffb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:42.139985 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.139966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cc46cf5b-f4d6-43f9-b959-fad786ee3667-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68bb9b987f-g22tk\" (UID: \"cc46cf5b-f4d6-43f9-b959-fad786ee3667\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" Apr 23 13:32:42.140087 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.140068 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b645t\" (UniqueName: \"kubernetes.io/projected/70f94fd0-7f09-4697-9cbe-0c891f7f9740-kube-api-access-b645t\") pod \"cluster-proxy-proxy-agent-769587dc6f-7jf96\" (UID: \"70f94fd0-7f09-4697-9cbe-0c891f7f9740\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.140263 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.140248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kjbb\" (UniqueName: \"kubernetes.io/projected/cc46cf5b-f4d6-43f9-b959-fad786ee3667-kube-api-access-9kjbb\") pod \"managed-serviceaccount-addon-agent-68bb9b987f-g22tk\" (UID: \"cc46cf5b-f4d6-43f9-b959-fad786ee3667\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" Apr 23 13:32:42.140362 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.140346 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgsb\" (UniqueName: \"kubernetes.io/projected/f701d829-2677-4e05-be5e-da1e52476ffb-kube-api-access-vxgsb\") pod \"klusterlet-addon-workmgr-5c77fc8847-462fs\" (UID: \"f701d829-2677-4e05-be5e-da1e52476ffb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:42.232025 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.231994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:32:42.232174 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:42.232132 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:42.232213 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:42.232192 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert podName:5d0b8118-d3c3-4333-a6a3-c53abf8e3daa nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:43.232175421 +0000 UTC m=+36.210083996 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert") pod "ingress-canary-b9l6f" (UID: "5d0b8118-d3c3-4333-a6a3-c53abf8e3daa") : secret "canary-serving-cert" not found Apr 23 13:32:42.267980 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.267958 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" Apr 23 13:32:42.273652 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.273625 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:32:42.278293 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.278269 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:32:42.433406 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.433374 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs"] Apr 23 13:32:42.436249 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.436228 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk"] Apr 23 13:32:42.437234 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:42.437213 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf701d829_2677_4e05_be5e_da1e52476ffb.slice/crio-067fd95f6d73f80b79afe45a18614092111d794031491452e7b1ed306b39a7d5 WatchSource:0}: Error finding container 067fd95f6d73f80b79afe45a18614092111d794031491452e7b1ed306b39a7d5: Status 404 returned error can't find the container with id 067fd95f6d73f80b79afe45a18614092111d794031491452e7b1ed306b39a7d5 Apr 23 13:32:42.439364 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:42.439340 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc46cf5b_f4d6_43f9_b959_fad786ee3667.slice/crio-f52c1be4d57b0bb289b572802dd1bb46c58ef03343f0c341625a78674842e14c WatchSource:0}: Error finding container f52c1be4d57b0bb289b572802dd1bb46c58ef03343f0c341625a78674842e14c: Status 404 returned error can't find the container with id f52c1be4d57b0bb289b572802dd1bb46c58ef03343f0c341625a78674842e14c Apr 23 13:32:42.444019 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.443996 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96"] Apr 23 13:32:42.452545 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:42.452463 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70f94fd0_7f09_4697_9cbe_0c891f7f9740.slice/crio-41227c7b108ad5ed50610ae9f2dea0dc91e368f2590ce5df3bb9616499ca05a8 WatchSource:0}: Error finding container 41227c7b108ad5ed50610ae9f2dea0dc91e368f2590ce5df3bb9616499ca05a8: Status 404 returned error can't find the container with id 41227c7b108ad5ed50610ae9f2dea0dc91e368f2590ce5df3bb9616499ca05a8 Apr 23 13:32:42.579759 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.579735 2577 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8" Apr 23 13:32:42.579896 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.579736 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td" Apr 23 13:32:42.579956 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.579740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:32:42.583023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.583002 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jcmp5\"" Apr 23 13:32:42.583023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.583015 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 13:32:42.583212 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.583029 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 13:32:42.583212 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.583064 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 13:32:42.583212 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.583072 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj9pj\"" Apr 23 13:32:42.583212 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.583182 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 13:32:42.716070 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.716043 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a590caf-dc65-421e-a4c8-40d3258ddd7b" containerID="d0d8e06a0d3e5f1ccc6c17608339f9ad64b9cb89439720625b415a39923e2c9a" exitCode=0 Apr 23 13:32:42.716455 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.716105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" event={"ID":"4a590caf-dc65-421e-a4c8-40d3258ddd7b","Type":"ContainerDied","Data":"d0d8e06a0d3e5f1ccc6c17608339f9ad64b9cb89439720625b415a39923e2c9a"} Apr 23 13:32:42.717205 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.717174 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" event={"ID":"f701d829-2677-4e05-be5e-da1e52476ffb","Type":"ContainerStarted","Data":"067fd95f6d73f80b79afe45a18614092111d794031491452e7b1ed306b39a7d5"} Apr 23 13:32:42.718192 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.718170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" event={"ID":"cc46cf5b-f4d6-43f9-b959-fad786ee3667","Type":"ContainerStarted","Data":"f52c1be4d57b0bb289b572802dd1bb46c58ef03343f0c341625a78674842e14c"} Apr 23 13:32:42.719055 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:42.719031 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" 
event={"ID":"70f94fd0-7f09-4697-9cbe-0c891f7f9740","Type":"ContainerStarted","Data":"41227c7b108ad5ed50610ae9f2dea0dc91e368f2590ce5df3bb9616499ca05a8"} Apr 23 13:32:43.137757 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:43.137578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:32:43.137757 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:43.137746 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:43.138025 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:43.137813 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls podName:d3ae8909-ecd0-47e1-a99c-4ea293db3077 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:45.137795303 +0000 UTC m=+38.115703873 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls") pod "dns-default-rrmk8" (UID: "d3ae8909-ecd0-47e1-a99c-4ea293db3077") : secret "dns-default-metrics-tls" not found Apr 23 13:32:43.138025 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:43.137856 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:32:43.138025 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:43.138018 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:43.138177 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:43.138035 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-986fd6f74-qp86z: secret "image-registry-tls" not found Apr 23 13:32:43.138177 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:43.138090 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls podName:2de7f8d7-9da6-4fa8-b5a9-c641f96806e0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:45.138074054 +0000 UTC m=+38.115982631 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls") pod "image-registry-986fd6f74-qp86z" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0") : secret "image-registry-tls" not found
Apr 23 13:32:43.239134 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:43.239101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f"
Apr 23 13:32:43.239332 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:43.239216 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:43.239332 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:43.239283 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert podName:5d0b8118-d3c3-4333-a6a3-c53abf8e3daa nodeName:}" failed. No retries permitted until 2026-04-23 13:32:45.239267221 +0000 UTC m=+38.217175791 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert") pod "ingress-canary-b9l6f" (UID: "5d0b8118-d3c3-4333-a6a3-c53abf8e3daa") : secret "canary-serving-cert" not found
Apr 23 13:32:43.726101 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:43.725990 2577 generic.go:358] "Generic (PLEG): container finished" podID="4a590caf-dc65-421e-a4c8-40d3258ddd7b" containerID="860cd4b3053a0ab150b6a19f29b7cec8e7ea713d6108b56b0b40367add3edd81" exitCode=0
Apr 23 13:32:43.726101 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:43.726063 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" event={"ID":"4a590caf-dc65-421e-a4c8-40d3258ddd7b","Type":"ContainerDied","Data":"860cd4b3053a0ab150b6a19f29b7cec8e7ea713d6108b56b0b40367add3edd81"}
Apr 23 13:32:45.157742 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:45.157212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8"
Apr 23 13:32:45.157742 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:45.157306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z"
Apr 23 13:32:45.157742 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:45.157373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8"
Apr 23 13:32:45.158587 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:45.157744 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:45.158587 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:45.157785 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:45.158587 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:45.157808 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls podName:d3ae8909-ecd0-47e1-a99c-4ea293db3077 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:49.157789286 +0000 UTC m=+42.135697871 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls") pod "dns-default-rrmk8" (UID: "d3ae8909-ecd0-47e1-a99c-4ea293db3077") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:45.158587 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:45.157808 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-986fd6f74-qp86z: secret "image-registry-tls" not found
Apr 23 13:32:45.158587 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:45.157841 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls podName:2de7f8d7-9da6-4fa8-b5a9-c641f96806e0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:49.157833946 +0000 UTC m=+42.135742515 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls") pod "image-registry-986fd6f74-qp86z" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0") : secret "image-registry-tls" not found
Apr 23 13:32:45.165036 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:45.164983 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75053ed6-040a-450c-b423-ce9ec4714d2f-original-pull-secret\") pod \"global-pull-secret-syncer-x4nd8\" (UID: \"75053ed6-040a-450c-b423-ce9ec4714d2f\") " pod="kube-system/global-pull-secret-syncer-x4nd8"
Apr 23 13:32:45.258197 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:45.258158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f"
Apr 23 13:32:45.258373 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:45.258326 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:45.258449 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:45.258399 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert podName:5d0b8118-d3c3-4333-a6a3-c53abf8e3daa nodeName:}" failed. No retries permitted until 2026-04-23 13:32:49.25837936 +0000 UTC m=+42.236287936 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert") pod "ingress-canary-b9l6f" (UID: "5d0b8118-d3c3-4333-a6a3-c53abf8e3daa") : secret "canary-serving-cert" not found
Apr 23 13:32:45.289948 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:45.289916 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4nd8"
Apr 23 13:32:45.732344 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:45.732303 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" event={"ID":"4a590caf-dc65-421e-a4c8-40d3258ddd7b","Type":"ContainerStarted","Data":"2027d9569ec1243986d370d4e7f4ebde3a138e4d704a0d16b121238673733cdd"}
Apr 23 13:32:45.755279 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:45.755240 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p9tzn" podStartSLOduration=7.395883694 podStartE2EDuration="38.755228856s" podCreationTimestamp="2026-04-23 13:32:07 +0000 UTC" firstStartedPulling="2026-04-23 13:32:10.251111093 +0000 UTC m=+3.229019662" lastFinishedPulling="2026-04-23 13:32:41.61045625 +0000 UTC m=+34.588364824" observedRunningTime="2026-04-23 13:32:45.754605641 +0000 UTC m=+38.732514234" watchObservedRunningTime="2026-04-23 13:32:45.755228856 +0000 UTC m=+38.733137447"
Apr 23 13:32:48.449351 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:48.449324 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x4nd8"]
Apr 23 13:32:48.452353 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:32:48.452323 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75053ed6_040a_450c_b423_ce9ec4714d2f.slice/crio-477b2387c198021c3ffb79c11089e61725b797bd1885eb9196310afd2cae936d WatchSource:0}: Error finding container 477b2387c198021c3ffb79c11089e61725b797bd1885eb9196310afd2cae936d: Status 404 returned error can't find the container with id 477b2387c198021c3ffb79c11089e61725b797bd1885eb9196310afd2cae936d
Apr 23 13:32:48.739607 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:48.739570 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" event={"ID":"70f94fd0-7f09-4697-9cbe-0c891f7f9740","Type":"ContainerStarted","Data":"1db2fb898e52114ad8f6ec41fb5dd18ca36a231dedd4fb1a3fd6be21fa75a044"}
Apr 23 13:32:48.740716 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:48.740694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" event={"ID":"f701d829-2677-4e05-be5e-da1e52476ffb","Type":"ContainerStarted","Data":"59758a7044633e91e60e09064878c72a5b02fd6bb1f6dff9c56085527224b973"}
Apr 23 13:32:48.740935 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:48.740910 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs"
Apr 23 13:32:48.742118 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:48.742094 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" event={"ID":"cc46cf5b-f4d6-43f9-b959-fad786ee3667","Type":"ContainerStarted","Data":"13d3cd68c6fa11ba2e4683c0c1f6beb5b15cfbb7ed1949563d512012e894798e"}
Apr 23 13:32:48.742614 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:48.742595 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs"
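Note: in the "Observed pod startup duration" record above, podStartSLOduration is not a truncated copy of the E2E number; it is the end-to-end duration with the image-pull window subtracted. A quick arithmetic check with the values from the multus-additional-cni-plugins-p9tzn record (an illustration of the relationship visible in the log, not the kubelet's own code):

    // Check: SLO duration = E2E duration - (lastFinishedPulling - firstStartedPulling).
    // All numbers copied from the log record above (m=+ monotonic offsets for the pulls).
    package main

    import "fmt"

    func main() {
        e2e := 38.755228856                // podStartE2EDuration, seconds
        pull := 34.588364824 - 3.229019662 // pull window from the m=+ offsets, ~31.36s
        fmt.Printf("slo = %.9f\n", e2e-pull) // prints ~7.395883694, matching podStartSLOduration
    }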
Apr 23 13:32:48.745385 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:48.745364 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x4nd8" event={"ID":"75053ed6-040a-450c-b423-ce9ec4714d2f","Type":"ContainerStarted","Data":"477b2387c198021c3ffb79c11089e61725b797bd1885eb9196310afd2cae936d"}
Apr 23 13:32:48.757433 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:48.757396 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" podStartSLOduration=1.852570353 podStartE2EDuration="7.757386798s" podCreationTimestamp="2026-04-23 13:32:41 +0000 UTC" firstStartedPulling="2026-04-23 13:32:42.439087871 +0000 UTC m=+35.416996442" lastFinishedPulling="2026-04-23 13:32:48.343904314 +0000 UTC m=+41.321812887" observedRunningTime="2026-04-23 13:32:48.756957081 +0000 UTC m=+41.734865673" watchObservedRunningTime="2026-04-23 13:32:48.757386798 +0000 UTC m=+41.735295389"
Apr 23 13:32:48.787711 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:48.787672 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" podStartSLOduration=1.9012769710000001 podStartE2EDuration="7.787662012s" podCreationTimestamp="2026-04-23 13:32:41 +0000 UTC" firstStartedPulling="2026-04-23 13:32:42.441090228 +0000 UTC m=+35.418998802" lastFinishedPulling="2026-04-23 13:32:48.32747527 +0000 UTC m=+41.305383843" observedRunningTime="2026-04-23 13:32:48.78702452 +0000 UTC m=+41.764933113" watchObservedRunningTime="2026-04-23 13:32:48.787662012 +0000 UTC m=+41.765570582"
Apr 23 13:32:49.191579 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:49.191546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8"
Apr 23 13:32:49.191742 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:49.191635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z"
Apr 23 13:32:49.191742 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:49.191706 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:49.191846 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:49.191750 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:49.191846 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:49.191765 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-986fd6f74-qp86z: secret "image-registry-tls" not found
Apr 23 13:32:49.191846 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:49.191780 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls podName:d3ae8909-ecd0-47e1-a99c-4ea293db3077 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:57.191759307 +0000 UTC m=+50.169667883 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls") pod "dns-default-rrmk8" (UID: "d3ae8909-ecd0-47e1-a99c-4ea293db3077") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:49.191846 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:49.191814 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls podName:2de7f8d7-9da6-4fa8-b5a9-c641f96806e0 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:57.19179853 +0000 UTC m=+50.169707105 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls") pod "image-registry-986fd6f74-qp86z" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0") : secret "image-registry-tls" not found
Apr 23 13:32:49.292940 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:49.292869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f"
Apr 23 13:32:49.293087 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:49.293005 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:49.293087 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:49.293067 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert podName:5d0b8118-d3c3-4333-a6a3-c53abf8e3daa nodeName:}" failed. No retries permitted until 2026-04-23 13:32:57.29304821 +0000 UTC m=+50.270956779 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert") pod "ingress-canary-b9l6f" (UID: "5d0b8118-d3c3-4333-a6a3-c53abf8e3daa") : secret "canary-serving-cert" not found
Apr 23 13:32:53.760286 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:53.760252 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x4nd8" event={"ID":"75053ed6-040a-450c-b423-ce9ec4714d2f","Type":"ContainerStarted","Data":"c7034564c909d84713fcd02ad6841465794ce251516673d170e76fc06adea691"}
Apr 23 13:32:53.762007 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:53.761981 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" event={"ID":"70f94fd0-7f09-4697-9cbe-0c891f7f9740","Type":"ContainerStarted","Data":"a0daec5d95e6c72a8f2691024f20b9044b0bb95eb36c16103265fcf5a7d04370"}
Apr 23 13:32:53.762129 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:53.762012 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" event={"ID":"70f94fd0-7f09-4697-9cbe-0c891f7f9740","Type":"ContainerStarted","Data":"6d708bed5f2b4a54b182033fcf269ad117897e998ab61c6418bba1c33f180f38"}
Apr 23 13:32:53.783394 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:53.783351 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x4nd8" podStartSLOduration=35.94817684 podStartE2EDuration="40.783339912s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:32:48.454035009 +0000 UTC m=+41.431943579" lastFinishedPulling="2026-04-23 13:32:53.289198081 +0000 UTC m=+46.267106651" observedRunningTime="2026-04-23 13:32:53.782606124 +0000 UTC m=+46.760514716" watchObservedRunningTime="2026-04-23 13:32:53.783339912 +0000 UTC m=+46.761248500"
Apr 23 13:32:53.802959 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:53.802891 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" podStartSLOduration=1.962160245 podStartE2EDuration="12.802881322s" podCreationTimestamp="2026-04-23 13:32:41 +0000 UTC" firstStartedPulling="2026-04-23 13:32:42.453956791 +0000 UTC m=+35.431865361" lastFinishedPulling="2026-04-23 13:32:53.294677867 +0000 UTC m=+46.272586438" observedRunningTime="2026-04-23 13:32:53.802311362 +0000 UTC m=+46.780219954" watchObservedRunningTime="2026-04-23 13:32:53.802881322 +0000 UTC m=+46.780789951"
Apr 23 13:32:57.254883 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:57.254852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z"
Apr 23 13:32:57.255242 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:57.254914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8"
Apr 23 13:32:57.255242 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:57.254998 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:57.255242 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:57.255017 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-986fd6f74-qp86z: secret "image-registry-tls" not found
Apr 23 13:32:57.255242 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:57.255065 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls podName:2de7f8d7-9da6-4fa8-b5a9-c641f96806e0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:13.255051474 +0000 UTC m=+66.232960045 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls") pod "image-registry-986fd6f74-qp86z" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0") : secret "image-registry-tls" not found
Apr 23 13:32:57.255242 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:57.255007 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:57.255242 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:57.255135 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls podName:d3ae8909-ecd0-47e1-a99c-4ea293db3077 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:13.255120457 +0000 UTC m=+66.233029029 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls") pod "dns-default-rrmk8" (UID: "d3ae8909-ecd0-47e1-a99c-4ea293db3077") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:57.355419 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:32:57.355390 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f"
Apr 23 13:32:57.355577 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:57.355511 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:57.355577 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:32:57.355556 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert podName:5d0b8118-d3c3-4333-a6a3-c53abf8e3daa nodeName:}" failed. No retries permitted until 2026-04-23 13:33:13.355545186 +0000 UTC m=+66.333453756 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert") pod "ingress-canary-b9l6f" (UID: "5d0b8118-d3c3-4333-a6a3-c53abf8e3daa") : secret "canary-serving-cert" not found
Apr 23 13:33:06.712214 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:06.712186 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mwbjc"
Apr 23 13:33:13.260848 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.260812 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z"
Apr 23 13:33:13.261282 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.260878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8"
Apr 23 13:33:13.261282 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:13.260976 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:13.261282 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:13.260986 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:33:13.261282 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:13.261010 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-986fd6f74-qp86z: secret "image-registry-tls" not found
Apr 23 13:33:13.261282 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:13.261025 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls podName:d3ae8909-ecd0-47e1-a99c-4ea293db3077 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:45.261012792 +0000 UTC m=+98.238921362 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls") pod "dns-default-rrmk8" (UID: "d3ae8909-ecd0-47e1-a99c-4ea293db3077") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:13.261282 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:13.261061 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls podName:2de7f8d7-9da6-4fa8-b5a9-c641f96806e0 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:45.261044565 +0000 UTC m=+98.238953140 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls") pod "image-registry-986fd6f74-qp86z" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0") : secret "image-registry-tls" not found
Apr 23 13:33:13.361776 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.361741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w"
Apr 23 13:33:13.361776 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.361780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f"
Apr 23 13:33:13.362015 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:13.361896 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:13.362015 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:13.361946 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert podName:5d0b8118-d3c3-4333-a6a3-c53abf8e3daa nodeName:}" failed. No retries permitted until 2026-04-23 13:33:45.361933236 +0000 UTC m=+98.339841806 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert") pod "ingress-canary-b9l6f" (UID: "5d0b8118-d3c3-4333-a6a3-c53abf8e3daa") : secret "canary-serving-cert" not found
Apr 23 13:33:13.364679 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.364657 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 13:33:13.372665 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:13.372646 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:33:13.372744 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:13.372705 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs podName:821df7f9-3f87-4f86-a7e9-82cad302fff0 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:17.372692161 +0000 UTC m=+130.350600732 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs") pod "network-metrics-daemon-6pz6w" (UID: "821df7f9-3f87-4f86-a7e9-82cad302fff0") : secret "metrics-daemon-secret" not found
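Note: the durationBeforeRetry values for these mount retries double on every failure: 2s, 4s, 8s, 16s, 32s, 1m4s, and finally 2m2s (metrics-certs at 13:34:17 is not retried until 13:36:19). The node-exporter-tls failure later in this log starts at 500ms, which suggests the base delay; the volumes here are already several attempts in when this section begins. A minimal sketch of that doubling-with-cap pattern (illustrative only; the kubelet's own logic lives behind nestedpendingoperations.go and its exponential-backoff helper):

    // Doubling backoff with a cap, matching the durationBeforeRetry progression
    // observed above (2s -> 4s -> ... -> 1m4s -> 2m2s). Sketch, not kubelet code.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 2 * time.Second
        const maxDelay = 2*time.Minute + 2*time.Second // largest delay seen in this log
        for i := 0; i < 8; i++ {
            fmt.Println("retry in", delay) // 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }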
Apr 23 13:33:13.462622 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.462587 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjg76\" (UniqueName: \"kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76\") pod \"network-check-target-fl2td\" (UID: \"373af144-ae77-4496-8057-d855373807e4\") " pod="openshift-network-diagnostics/network-check-target-fl2td"
Apr 23 13:33:13.465671 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.465650 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 13:33:13.475612 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.475589 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 13:33:13.486516 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.486466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjg76\" (UniqueName: \"kubernetes.io/projected/373af144-ae77-4496-8057-d855373807e4-kube-api-access-gjg76\") pod \"network-check-target-fl2td\" (UID: \"373af144-ae77-4496-8057-d855373807e4\") " pod="openshift-network-diagnostics/network-check-target-fl2td"
Apr 23 13:33:13.498098 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.498081 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jcmp5\""
Apr 23 13:33:13.506309 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.506291 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fl2td"
Apr 23 13:33:13.616334 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.616228 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fl2td"]
Apr 23 13:33:13.618980 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:33:13.618953 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod373af144_ae77_4496_8057_d855373807e4.slice/crio-bf3df03ef34c3b27681ae02f71f821dd63d0236c112a1ab53815ecc504087eb3 WatchSource:0}: Error finding container bf3df03ef34c3b27681ae02f71f821dd63d0236c112a1ab53815ecc504087eb3: Status 404 returned error can't find the container with id bf3df03ef34c3b27681ae02f71f821dd63d0236c112a1ab53815ecc504087eb3
Apr 23 13:33:13.809680 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:13.809596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fl2td" event={"ID":"373af144-ae77-4496-8057-d855373807e4","Type":"ContainerStarted","Data":"bf3df03ef34c3b27681ae02f71f821dd63d0236c112a1ab53815ecc504087eb3"}
Apr 23 13:33:16.818483 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:16.818457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fl2td" event={"ID":"373af144-ae77-4496-8057-d855373807e4","Type":"ContainerStarted","Data":"a98a1257c3a6828e420defd35916980ab2f5d2567ff3489792702ea859713374"}
Apr 23 13:33:16.818798 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:16.818600 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fl2td"
Apr 23 13:33:16.836434 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:16.836392 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fl2td" podStartSLOduration=66.800680453 podStartE2EDuration="1m9.836379801s" podCreationTimestamp="2026-04-23 13:32:07 +0000 UTC" firstStartedPulling="2026-04-23 13:33:13.621271958 +0000 UTC m=+66.599180528" lastFinishedPulling="2026-04-23 13:33:16.656971305 +0000 UTC m=+69.634879876" observedRunningTime="2026-04-23 13:33:16.835661181 +0000 UTC m=+69.813569774" watchObservedRunningTime="2026-04-23 13:33:16.836379801 +0000 UTC m=+69.814288393"
Apr 23 13:33:45.284063 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:45.284026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z"
Apr 23 13:33:45.284584 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:45.284088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8"
Apr 23 13:33:45.284584 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:45.284180 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:33:45.284584 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:45.284186 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:45.284584 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:45.284196 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-986fd6f74-qp86z: secret "image-registry-tls" not found
Apr 23 13:33:45.284584 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:45.284262 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls podName:2de7f8d7-9da6-4fa8-b5a9-c641f96806e0 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:49.284247168 +0000 UTC m=+162.262155742 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls") pod "image-registry-986fd6f74-qp86z" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0") : secret "image-registry-tls" not found
Apr 23 13:33:45.284584 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:45.284276 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls podName:d3ae8909-ecd0-47e1-a99c-4ea293db3077 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:49.284270867 +0000 UTC m=+162.262179440 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls") pod "dns-default-rrmk8" (UID: "d3ae8909-ecd0-47e1-a99c-4ea293db3077") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:45.385212 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:45.385188 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f"
Apr 23 13:33:45.385326 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:45.385292 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:45.385370 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:33:45.385330 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert podName:5d0b8118-d3c3-4333-a6a3-c53abf8e3daa nodeName:}" failed. No retries permitted until 2026-04-23 13:34:49.385319534 +0000 UTC m=+162.363228105 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert") pod "ingress-canary-b9l6f" (UID: "5d0b8118-d3c3-4333-a6a3-c53abf8e3daa") : secret "canary-serving-cert" not found
Apr 23 13:33:47.823809 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:33:47.823779 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fl2td"
Apr 23 13:34:13.402183 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:13.402157 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wxlj2_5ab5642a-1989-41c1-956f-98f92fcc6f23/dns-node-resolver/0.log"
Apr 23 13:34:14.208175 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:14.208146 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mbln5_aecfee63-4703-49e8-81cc-aa07bc06ce4e/node-ca/0.log"
Apr 23 13:34:17.407692 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:17.407647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w"
Apr 23 13:34:17.408088 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:34:17.407795 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:34:17.408088 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:34:17.407865 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs podName:821df7f9-3f87-4f86-a7e9-82cad302fff0 nodeName:}" failed. No retries permitted until 2026-04-23 13:36:19.40785116 +0000 UTC m=+252.385759729 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs") pod "network-metrics-daemon-6pz6w" (UID: "821df7f9-3f87-4f86-a7e9-82cad302fff0") : secret "metrics-daemon-secret" not found
Apr 23 13:34:35.933917 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:35.933888 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dhqrw"]
Apr 23 13:34:35.936939 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:35.936921 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dhqrw"
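Note: every mount failure in this stretch has the same root cause: a Secret referenced by a pod volume (canary-serving-cert, image-registry-tls, dns-default-metrics-tls, metrics-daemon-secret) does not exist yet in its namespace; the mounts resolve on their own once the owning operators create the secrets. The quickest check is `oc -n <namespace> get secret <name>`. A hedged client-go sketch of the same lookup the kubelet is effectively performing here (illustrative; the kubelet's own path goes through its secret manager in secret.go / projected.go):

    // Probe whether the Secrets named in the mount failures exist yet.
    // Namespaces and names are copied from the log lines above.
    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)
        checks := map[string]string{ // namespace -> secret name
            "openshift-ingress-canary": "canary-serving-cert",
            "openshift-image-registry": "image-registry-tls",
            "openshift-dns":            "dns-default-metrics-tls",
            "openshift-multus":         "metrics-daemon-secret",
        }
        for ns, name := range checks {
            _, err := client.CoreV1().Secrets(ns).Get(context.TODO(), name, metav1.GetOptions{})
            fmt.Printf("%s/%s: exists=%v (err=%v)\n", ns, name, err == nil, err)
        }
    }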
Apr 23 13:34:35.942505 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:35.942474 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 13:34:35.942505 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:35.942520 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hkznh\""
Apr 23 13:34:35.943308 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:35.942947 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 13:34:35.943308 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:35.943209 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 13:34:35.944932 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:35.944916 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 13:34:35.967209 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:35.967183 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dhqrw"]
Apr 23 13:34:36.038444 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.038422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-crio-socket\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.038444 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.038450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-data-volume\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.038621 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.038474 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.038621 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.038541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c8bd\" (UniqueName: \"kubernetes.io/projected/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-kube-api-access-8c8bd\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.038621 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.038577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.139561 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.139539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-crio-socket\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.139673 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.139566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-data-volume\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.139673 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.139590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.139673 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.139605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8c8bd\" (UniqueName: \"kubernetes.io/projected/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-kube-api-access-8c8bd\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.139673 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.139636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.139673 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.139652 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-crio-socket\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.139952 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.139936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-data-volume\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.140188 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.140173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.141833 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.141813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.148107 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.148083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c8bd\" (UniqueName: \"kubernetes.io/projected/bec1f7dd-ec47-40f1-8ca6-554c81f3b55c-kube-api-access-8c8bd\") pod \"insights-runtime-extractor-dhqrw\" (UID: \"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c\") " pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.245538 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.245471 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dhqrw"
Apr 23 13:34:36.375926 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.375898 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dhqrw"]
Apr 23 13:34:36.378429 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:34:36.378393 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbec1f7dd_ec47_40f1_8ca6_554c81f3b55c.slice/crio-2fc8a9888c6979f848955ad3c900f01062c57b9ba3f29c4cf96a4c2914ae75be WatchSource:0}: Error finding container 2fc8a9888c6979f848955ad3c900f01062c57b9ba3f29c4cf96a4c2914ae75be: Status 404 returned error can't find the container with id 2fc8a9888c6979f848955ad3c900f01062c57b9ba3f29c4cf96a4c2914ae75be
Apr 23 13:34:36.995653 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.995619 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhqrw" event={"ID":"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c","Type":"ContainerStarted","Data":"0eeab7dce25500148bbef204dbb8fb4578db6129f784ff6ac0e2a97b90847c9a"}
Apr 23 13:34:36.995653 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:36.995656 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhqrw" event={"ID":"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c","Type":"ContainerStarted","Data":"2fc8a9888c6979f848955ad3c900f01062c57b9ba3f29c4cf96a4c2914ae75be"}
Apr 23 13:34:37.999599 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:37.999551 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhqrw" event={"ID":"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c","Type":"ContainerStarted","Data":"d80527b16e553e042268472d2fd13df6be9cde56875e6638157e86221887b957"}
Apr 23 13:34:39.002914 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:39.002876 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhqrw" event={"ID":"bec1f7dd-ec47-40f1-8ca6-554c81f3b55c","Type":"ContainerStarted","Data":"c319b0e4f1ff48e60094adb76a08948d9cadb08dae6117b73a1503724c9de8f0"}
Apr 23 13:34:39.020193 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:39.020142 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dhqrw" podStartSLOduration=2.074916677 podStartE2EDuration="4.020127731s" podCreationTimestamp="2026-04-23 13:34:35 +0000 UTC" firstStartedPulling="2026-04-23 13:34:36.437200871 +0000 UTC m=+149.415109445" lastFinishedPulling="2026-04-23 13:34:38.382411924 +0000 UTC m=+151.360320499" observedRunningTime="2026-04-23 13:34:39.019421312 +0000 UTC m=+151.997329920" watchObservedRunningTime="2026-04-23 13:34:39.020127731 +0000 UTC m=+151.998036321"
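Note: the "SyncLoop (PLEG): event for pod" lines are the pod lifecycle event generator relaying container state changes from the runtime into the kubelet's sync loop, and the event={"ID":...,"Type":...,"Data":...} payload maps onto a small event struct. A paraphrase of its shape, inferred from the log format (local stand-in types only; the kubelet's own definitions live in its pleg package and use richer types such as the pod UID type):

    // Shape of the PLEG events printed above, paraphrased from the log's
    // event={"ID":...,"Type":"ContainerStarted","Data":"<container id>"} payloads.
    package main

    import "fmt"

    type PodLifeCycleEventType string

    const (
        ContainerStarted PodLifeCycleEventType = "ContainerStarted"
        ContainerDied    PodLifeCycleEventType = "ContainerDied"
    )

    type PodLifecycleEvent struct {
        ID   string                // pod UID
        Type PodLifeCycleEventType // what happened to the container
        Data interface{}           // usually the container or sandbox ID
    }

    func main() {
        e := PodLifecycleEvent{
            ID:   "bec1f7dd-ec47-40f1-8ca6-554c81f3b55c", // insights-runtime-extractor-dhqrw
            Type: ContainerStarted,
            Data: "c319b0e4f1ff48e60094adb76a08948d9cadb08dae6117b73a1503724c9de8f0",
        }
        fmt.Printf("event=%+v\n", e)
    }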
Apr 23 13:34:44.432392 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:34:44.432346 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" podUID="2de7f8d7-9da6-4fa8-b5a9-c641f96806e0"
Apr 23 13:34:44.444569 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:34:44.444540 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rrmk8" podUID="d3ae8909-ecd0-47e1-a99c-4ea293db3077"
Apr 23 13:34:44.458717 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:34:44.458687 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-b9l6f" podUID="5d0b8118-d3c3-4333-a6a3-c53abf8e3daa"
Apr 23 13:34:45.015222 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:45.015188 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-986fd6f74-qp86z"
Apr 23 13:34:45.015419 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:45.015188 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rrmk8"
Apr 23 13:34:45.600817 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:34:45.600783 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-6pz6w" podUID="821df7f9-3f87-4f86-a7e9-82cad302fff0"
Apr 23 13:34:47.433014 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.432984 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hw27g"]
Apr 23 13:34:47.438731 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.438712 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hw27g"
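Note: the "Error syncing pod, skipping ... context deadline exceeded" lines mark the point where each pod worker gave up waiting for its still-unmounted volumes; the wait is bounded (a bit over two minutes in stock kubelets, if memory serves) and the worker then requeues the whole sync, which is why the same pods reappear below. A generic illustration of the Go timeout pattern that produces exactly this error string (sketch only, not kubelet code):

    // "context deadline exceeded" is the standard Go context timeout error.
    // The kubelet bounds its wait for volume mounts with a context like this.
    package main

    import (
        "context"
        "fmt"
        "time"
    )

    func waitForMounts(ctx context.Context) error {
        select {
        case <-time.After(10 * time.Minute): // mounts never become ready in this sketch
            return nil
        case <-ctx.Done():
            return ctx.Err() // -> "context deadline exceeded"
        }
    }

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), 50*time.Millisecond)
        defer cancel()
        fmt.Println(waitForMounts(ctx)) // prints: context deadline exceeded
    }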
Apr 23 13:34:47.441057 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.441035 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 13:34:47.441283 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.441260 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 13:34:47.441389 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.441272 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 13:34:47.441389 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.441314 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2z86s\""
Apr 23 13:34:47.442536 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.442522 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 13:34:47.442625 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.442610 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 13:34:47.442681 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.442630 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 13:34:47.526057 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.526026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.526170 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.526061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-textfile\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.526170 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.526100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9fe377af-7b17-4ea6-9181-973d470d1441-root\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.526170 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.526117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwqn\" (UniqueName: \"kubernetes.io/projected/9fe377af-7b17-4ea6-9181-973d470d1441-kube-api-access-crwqn\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.526307 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.526174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9fe377af-7b17-4ea6-9181-973d470d1441-sys\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.526307 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.526194 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-tls\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.526307 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.526225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9fe377af-7b17-4ea6-9181-973d470d1441-metrics-client-ca\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.526307 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.526242 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-wtmp\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.526465 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.526312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-accelerators-collector-config\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626718 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626828 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626723 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-textfile\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626828 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9fe377af-7b17-4ea6-9181-973d470d1441-root\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626828 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crwqn\" (UniqueName: \"kubernetes.io/projected/9fe377af-7b17-4ea6-9181-973d470d1441-kube-api-access-crwqn\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626975 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626822 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9fe377af-7b17-4ea6-9181-973d470d1441-root\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626975 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9fe377af-7b17-4ea6-9181-973d470d1441-sys\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626975 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626882 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-tls\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626975 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626921 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9fe377af-7b17-4ea6-9181-973d470d1441-metrics-client-ca\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626975 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626944 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-wtmp\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626975 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626954 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9fe377af-7b17-4ea6-9181-973d470d1441-sys\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.626975 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.626975 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-accelerators-collector-config\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.627320 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:34:47.627009 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 23 13:34:47.627320 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.627022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-textfile\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.627320 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:34:47.627065 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-tls podName:9fe377af-7b17-4ea6-9181-973d470d1441 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:48.127047452 +0000 UTC m=+161.104956026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-tls") pod "node-exporter-hw27g" (UID: "9fe377af-7b17-4ea6-9181-973d470d1441") : secret "node-exporter-tls" not found
Apr 23 13:34:47.627320 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.627126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-wtmp\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.627552 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.627481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-accelerators-collector-config\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.627552 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.627524 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9fe377af-7b17-4ea6-9181-973d470d1441-metrics-client-ca\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.628972 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.628951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:47.636594 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:47.636575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwqn\" (UniqueName: \"kubernetes.io/projected/9fe377af-7b17-4ea6-9181-973d470d1441-kube-api-access-crwqn\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:48.132216 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:48.132187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-tls\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:48.134283 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:48.134257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9fe377af-7b17-4ea6-9181-973d470d1441-node-exporter-tls\") pod \"node-exporter-hw27g\" (UID: \"9fe377af-7b17-4ea6-9181-973d470d1441\") " pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:48.347959 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:48.347926 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hw27g"
Apr 23 13:34:48.357194 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:34:48.357168 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fe377af_7b17_4ea6_9181_973d470d1441.slice/crio-85afb4001df52a6b0340c54e64a42b2e24466fa9fe0676f90b7f0c28784b46ad WatchSource:0}: Error finding container 85afb4001df52a6b0340c54e64a42b2e24466fa9fe0676f90b7f0c28784b46ad: Status 404 returned error can't find the container with id 85afb4001df52a6b0340c54e64a42b2e24466fa9fe0676f90b7f0c28784b46ad
Apr 23 13:34:48.741813 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:48.741750 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" podUID="f701d829-2677-4e05-be5e-da1e52476ffb" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused"
Apr 23 13:34:49.024643 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.024620 2577 generic.go:358] "Generic (PLEG): container finished" podID="f701d829-2677-4e05-be5e-da1e52476ffb" containerID="59758a7044633e91e60e09064878c72a5b02fd6bb1f6dff9c56085527224b973" exitCode=1
Apr 23 13:34:49.024731 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.024691 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" event={"ID":"f701d829-2677-4e05-be5e-da1e52476ffb","Type":"ContainerDied","Data":"59758a7044633e91e60e09064878c72a5b02fd6bb1f6dff9c56085527224b973"}
Apr 23 13:34:49.025049 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.025028 2577 scope.go:117] "RemoveContainer" containerID="59758a7044633e91e60e09064878c72a5b02fd6bb1f6dff9c56085527224b973"
Apr 23 13:34:49.025824 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.025803 2577 generic.go:358] "Generic (PLEG): container finished" podID="cc46cf5b-f4d6-43f9-b959-fad786ee3667" containerID="13d3cd68c6fa11ba2e4683c0c1f6beb5b15cfbb7ed1949563d512012e894798e" exitCode=255
Apr 23 13:34:49.025909 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.025872 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" event={"ID":"cc46cf5b-f4d6-43f9-b959-fad786ee3667","Type":"ContainerDied","Data":"13d3cd68c6fa11ba2e4683c0c1f6beb5b15cfbb7ed1949563d512012e894798e"}
Apr 23 13:34:49.026221 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.026206 2577 scope.go:117] "RemoveContainer" containerID="13d3cd68c6fa11ba2e4683c0c1f6beb5b15cfbb7ed1949563d512012e894798e"
Apr 23 13:34:49.026917 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.026886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hw27g" event={"ID":"9fe377af-7b17-4ea6-9181-973d470d1441","Type":"ContainerStarted","Data":"85afb4001df52a6b0340c54e64a42b2e24466fa9fe0676f90b7f0c28784b46ad"}
Apr 23 13:34:49.342853 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.342760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\")
" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:34:49.342853 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.342814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:34:49.345060 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.345033 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d3ae8909-ecd0-47e1-a99c-4ea293db3077-metrics-tls\") pod \"dns-default-rrmk8\" (UID: \"d3ae8909-ecd0-47e1-a99c-4ea293db3077\") " pod="openshift-dns/dns-default-rrmk8" Apr 23 13:34:49.345183 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.345160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"image-registry-986fd6f74-qp86z\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:34:49.443109 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.443079 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:34:49.445214 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.445196 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0b8118-d3c3-4333-a6a3-c53abf8e3daa-cert\") pod \"ingress-canary-b9l6f\" (UID: \"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa\") " pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:34:49.519689 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.519665 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svh2p\"" Apr 23 13:34:49.519793 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.519665 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7pd9g\"" Apr 23 13:34:49.527006 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.526991 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rrmk8" Apr 23 13:34:49.527089 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.527075 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:34:49.651359 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.651338 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rrmk8"] Apr 23 13:34:49.653360 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:34:49.653323 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3ae8909_ecd0_47e1_a99c_4ea293db3077.slice/crio-996c124ab0b1cd18d9215b911ab0e465051462cbc2abcf2bbb9a1cc9745c63b6 WatchSource:0}: Error finding container 996c124ab0b1cd18d9215b911ab0e465051462cbc2abcf2bbb9a1cc9745c63b6: Status 404 returned error can't find the container with id 996c124ab0b1cd18d9215b911ab0e465051462cbc2abcf2bbb9a1cc9745c63b6 Apr 23 13:34:49.667593 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:49.667565 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-986fd6f74-qp86z"] Apr 23 13:34:49.670677 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:34:49.670656 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2de7f8d7_9da6_4fa8_b5a9_c641f96806e0.slice/crio-16632a03c6bbc367719eb5c9bd02ac29fc95bcdec25f863b649de99a3d835e28 WatchSource:0}: Error finding container 16632a03c6bbc367719eb5c9bd02ac29fc95bcdec25f863b649de99a3d835e28: Status 404 returned error can't find the container with id 16632a03c6bbc367719eb5c9bd02ac29fc95bcdec25f863b649de99a3d835e28 Apr 23 13:34:50.030841 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.030803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" event={"ID":"f701d829-2677-4e05-be5e-da1e52476ffb","Type":"ContainerStarted","Data":"6a1cf776d6a1d9c192ab9bdb67e62ce357c0d30e819491c96344cd5aeba3fbdf"} Apr 23 13:34:50.031353 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.031098 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:34:50.031723 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.031705 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c77fc8847-462fs" Apr 23 13:34:50.032609 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.032585 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68bb9b987f-g22tk" event={"ID":"cc46cf5b-f4d6-43f9-b959-fad786ee3667","Type":"ContainerStarted","Data":"15f30c22daa3e8dffd79b640cb8cf19cb53cb730894ae0eda2fe10932d9ed92b"} Apr 23 13:34:50.033805 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.033782 2577 generic.go:358] "Generic (PLEG): container finished" podID="9fe377af-7b17-4ea6-9181-973d470d1441" containerID="454f6c866e41c7f37c3df77ac58c5d16a09fea2ad34fe4309c89f7313fe8c754" exitCode=0 Apr 23 13:34:50.033908 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.033853 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hw27g" event={"ID":"9fe377af-7b17-4ea6-9181-973d470d1441","Type":"ContainerDied","Data":"454f6c866e41c7f37c3df77ac58c5d16a09fea2ad34fe4309c89f7313fe8c754"} Apr 23 13:34:50.035393 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.035359 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" event={"ID":"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0","Type":"ContainerStarted","Data":"f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a"} Apr 23 13:34:50.035393 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.035389 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" event={"ID":"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0","Type":"ContainerStarted","Data":"16632a03c6bbc367719eb5c9bd02ac29fc95bcdec25f863b649de99a3d835e28"} Apr 23 13:34:50.035561 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.035461 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:34:50.036506 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.036454 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rrmk8" event={"ID":"d3ae8909-ecd0-47e1-a99c-4ea293db3077","Type":"ContainerStarted","Data":"996c124ab0b1cd18d9215b911ab0e465051462cbc2abcf2bbb9a1cc9745c63b6"} Apr 23 13:34:50.120774 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:50.120729 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" podStartSLOduration=162.120711433 podStartE2EDuration="2m42.120711433s" podCreationTimestamp="2026-04-23 13:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:50.119613498 +0000 UTC m=+163.097522089" watchObservedRunningTime="2026-04-23 13:34:50.120711433 +0000 UTC m=+163.098620025" Apr 23 13:34:51.041382 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:51.041353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hw27g" event={"ID":"9fe377af-7b17-4ea6-9181-973d470d1441","Type":"ContainerStarted","Data":"7c159822f7094e08ef79058ec43d40d2ce90ec01920a9b1c009410b82495e666"} Apr 23 13:34:51.041773 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:51.041391 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hw27g" event={"ID":"9fe377af-7b17-4ea6-9181-973d470d1441","Type":"ContainerStarted","Data":"cb98e690ebfd7c1ed42d2ccbcebdaea6db85c59d732f9c0f8af7b3f4018531af"} Apr 23 13:34:51.061664 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:51.061584 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hw27g" podStartSLOduration=3.416564478 podStartE2EDuration="4.061570594s" podCreationTimestamp="2026-04-23 13:34:47 +0000 UTC" firstStartedPulling="2026-04-23 13:34:48.359118019 +0000 UTC m=+161.337026589" lastFinishedPulling="2026-04-23 13:34:49.004124122 +0000 UTC m=+161.982032705" observedRunningTime="2026-04-23 13:34:51.060023516 +0000 UTC m=+164.037932107" watchObservedRunningTime="2026-04-23 13:34:51.061570594 +0000 UTC m=+164.039479223" Apr 23 13:34:52.047712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:52.047677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rrmk8" event={"ID":"d3ae8909-ecd0-47e1-a99c-4ea293db3077","Type":"ContainerStarted","Data":"104f9b1518747ea6c2f79401125c027aa4059bec87253941ef76dfd6e9b22ed3"} Apr 23 13:34:52.048169 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:52.048022 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-dns/dns-default-rrmk8" Apr 23 13:34:52.048169 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:52.048034 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rrmk8" event={"ID":"d3ae8909-ecd0-47e1-a99c-4ea293db3077","Type":"ContainerStarted","Data":"190d498d53c48e4a8dc8d685f0ab925b8b2d3bb0dbb5269a8396f3e91619ff2e"} Apr 23 13:34:52.065894 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:52.065853 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rrmk8" podStartSLOduration=129.787535126 podStartE2EDuration="2m11.065840561s" podCreationTimestamp="2026-04-23 13:32:41 +0000 UTC" firstStartedPulling="2026-04-23 13:34:49.655175457 +0000 UTC m=+162.633084030" lastFinishedPulling="2026-04-23 13:34:50.933480896 +0000 UTC m=+163.911389465" observedRunningTime="2026-04-23 13:34:52.065108281 +0000 UTC m=+165.043016870" watchObservedRunningTime="2026-04-23 13:34:52.065840561 +0000 UTC m=+165.043749151" Apr 23 13:34:56.580055 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:56.579952 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:34:56.583109 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:56.583081 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cqcw8\"" Apr 23 13:34:56.590400 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:56.590381 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b9l6f" Apr 23 13:34:56.699963 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:56.699930 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b9l6f"] Apr 23 13:34:56.703710 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:34:56.703684 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d0b8118_d3c3_4333_a6a3_c53abf8e3daa.slice/crio-e60fea02cbdd5c97b1498547850863c52153e156b4793a7b5118614bde60b4a2 WatchSource:0}: Error finding container e60fea02cbdd5c97b1498547850863c52153e156b4793a7b5118614bde60b4a2: Status 404 returned error can't find the container with id e60fea02cbdd5c97b1498547850863c52153e156b4793a7b5118614bde60b4a2 Apr 23 13:34:57.062213 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:57.062168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b9l6f" event={"ID":"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa","Type":"ContainerStarted","Data":"e60fea02cbdd5c97b1498547850863c52153e156b4793a7b5118614bde60b4a2"} Apr 23 13:34:59.067471 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:59.067438 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b9l6f" event={"ID":"5d0b8118-d3c3-4333-a6a3-c53abf8e3daa","Type":"ContainerStarted","Data":"d781d21050a5268dbb8ebb0b646d638c70ecabdef830321db3696a6d41d0a8c6"} Apr 23 13:34:59.083968 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:34:59.083925 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b9l6f" podStartSLOduration=136.69510561 podStartE2EDuration="2m18.083913284s" podCreationTimestamp="2026-04-23 13:32:41 +0000 UTC" firstStartedPulling="2026-04-23 13:34:56.705600171 +0000 UTC m=+169.683508746" lastFinishedPulling="2026-04-23 13:34:58.094407849 +0000 UTC 
m=+171.072316420" observedRunningTime="2026-04-23 13:34:59.082913384 +0000 UTC m=+172.060821987" watchObservedRunningTime="2026-04-23 13:34:59.083913284 +0000 UTC m=+172.061821875" Apr 23 13:35:00.580142 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:00.580104 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:35:02.052407 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:02.052373 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rrmk8" Apr 23 13:35:08.626574 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:08.626539 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-986fd6f74-qp86z"] Apr 23 13:35:08.630425 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:08.630399 2577 patch_prober.go:28] interesting pod/image-registry-986fd6f74-qp86z container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 13:35:08.630567 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:08.630441 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" podUID="2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:35:12.279918 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:12.279878 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" podUID="70f94fd0-7f09-4697-9cbe-0c891f7f9740" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 13:35:18.630411 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:18.630379 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:35:22.279645 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:22.279607 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" podUID="70f94fd0-7f09-4697-9cbe-0c891f7f9740" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 13:35:32.279407 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:32.279369 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" podUID="70f94fd0-7f09-4697-9cbe-0c891f7f9740" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 13:35:32.279869 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:32.279429 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" Apr 23 13:35:32.279935 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:32.279906 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a0daec5d95e6c72a8f2691024f20b9044b0bb95eb36c16103265fcf5a7d04370"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" containerMessage="Container service-proxy failed liveness 
probe, will be restarted" Apr 23 13:35:32.279976 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:32.279952 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" podUID="70f94fd0-7f09-4697-9cbe-0c891f7f9740" containerName="service-proxy" containerID="cri-o://a0daec5d95e6c72a8f2691024f20b9044b0bb95eb36c16103265fcf5a7d04370" gracePeriod=30 Apr 23 13:35:33.151262 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.151230 2577 generic.go:358] "Generic (PLEG): container finished" podID="70f94fd0-7f09-4697-9cbe-0c891f7f9740" containerID="a0daec5d95e6c72a8f2691024f20b9044b0bb95eb36c16103265fcf5a7d04370" exitCode=2 Apr 23 13:35:33.151445 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.151300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" event={"ID":"70f94fd0-7f09-4697-9cbe-0c891f7f9740","Type":"ContainerDied","Data":"a0daec5d95e6c72a8f2691024f20b9044b0bb95eb36c16103265fcf5a7d04370"} Apr 23 13:35:33.151445 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.151336 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-769587dc6f-7jf96" event={"ID":"70f94fd0-7f09-4697-9cbe-0c891f7f9740","Type":"ContainerStarted","Data":"86965b8bc8da970b9608442f827e36ed5df6f96d68e61307b20162c21c96c4ad"} Apr 23 13:35:33.644334 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.644276 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" podUID="2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" containerName="registry" containerID="cri-o://f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a" gracePeriod=30 Apr 23 13:35:33.875246 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.875225 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:35:33.971620 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.971590 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47jnp\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-kube-api-access-47jnp\") pod \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " Apr 23 13:35:33.971842 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.971636 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-certificates\") pod \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " Apr 23 13:35:33.971842 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.971664 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-ca-trust-extracted\") pod \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " Apr 23 13:35:33.971842 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.971685 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") pod \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " Apr 23 13:35:33.971842 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.971724 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-trusted-ca\") pod \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " Apr 23 13:35:33.971842 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.971763 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-bound-sa-token\") pod \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " Apr 23 13:35:33.972096 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.971889 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-image-registry-private-configuration\") pod \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " Apr 23 13:35:33.972096 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.971952 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-installation-pull-secrets\") pod \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\" (UID: \"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0\") " Apr 23 13:35:33.972201 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.972113 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:33.972201 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.972186 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:33.972308 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.972271 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-certificates\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.972308 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.972292 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-trusted-ca\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.974135 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.974065 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:35:33.974245 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.974196 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:35:33.974245 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.974203 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-kube-api-access-47jnp" (OuterVolumeSpecName: "kube-api-access-47jnp") pod "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0"). InnerVolumeSpecName "kube-api-access-47jnp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:35:33.974432 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.974410 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:33.974526 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.974507 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:33.980226 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:33.980203 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" (UID: "2de7f8d7-9da6-4fa8-b5a9-c641f96806e0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:35:34.072869 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.072826 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-bound-sa-token\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:35:34.072869 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.072864 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-image-registry-private-configuration\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:35:34.072869 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.072875 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-installation-pull-secrets\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:35:34.072869 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.072884 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47jnp\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-kube-api-access-47jnp\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:35:34.073101 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.072895 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-ca-trust-extracted\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:35:34.073101 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.072905 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0-registry-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:35:34.155010 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.154975 2577 generic.go:358] "Generic (PLEG): container finished" podID="2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" containerID="f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a" exitCode=0 Apr 23 13:35:34.155150 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.155036 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" Apr 23 13:35:34.155150 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.155062 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" event={"ID":"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0","Type":"ContainerDied","Data":"f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a"} Apr 23 13:35:34.155150 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.155104 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-986fd6f74-qp86z" event={"ID":"2de7f8d7-9da6-4fa8-b5a9-c641f96806e0","Type":"ContainerDied","Data":"16632a03c6bbc367719eb5c9bd02ac29fc95bcdec25f863b649de99a3d835e28"} Apr 23 13:35:34.155150 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.155120 2577 scope.go:117] "RemoveContainer" containerID="f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a" Apr 23 13:35:34.165845 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.165824 2577 scope.go:117] "RemoveContainer" containerID="f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a" Apr 23 13:35:34.166135 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:35:34.166116 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a\": container with ID starting with f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a not found: ID does not exist" containerID="f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a" Apr 23 13:35:34.166199 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.166145 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a"} err="failed to get container status \"f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a\": rpc error: code = NotFound desc = could not find container \"f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a\": container with ID starting with f9de026c2f9c4c382f0e05c6abb9f509f19510c56645a4e375042a19e226396a not found: ID does not exist" Apr 23 13:35:34.176172 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.176149 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-986fd6f74-qp86z"] Apr 23 13:35:34.182325 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:34.182305 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-986fd6f74-qp86z"] Apr 23 13:35:35.583066 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:35:35.583035 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" path="/var/lib/kubelet/pods/2de7f8d7-9da6-4fa8-b5a9-c641f96806e0/volumes" Apr 23 13:36:19.485016 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:36:19.484979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:36:19.487362 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:36:19.487339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/821df7f9-3f87-4f86-a7e9-82cad302fff0-metrics-certs\") pod \"network-metrics-daemon-6pz6w\" (UID: \"821df7f9-3f87-4f86-a7e9-82cad302fff0\") " pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:36:19.784165 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:36:19.784084 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj9pj\"" Apr 23 13:36:19.792428 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:36:19.792407 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6pz6w" Apr 23 13:36:19.902741 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:36:19.902710 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6pz6w"] Apr 23 13:36:19.905562 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:36:19.905531 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod821df7f9_3f87_4f86_a7e9_82cad302fff0.slice/crio-e7ddbf60541473210109574a9f9a9108626a9beec86f3153511eb504f18e0ac3 WatchSource:0}: Error finding container e7ddbf60541473210109574a9f9a9108626a9beec86f3153511eb504f18e0ac3: Status 404 returned error can't find the container with id e7ddbf60541473210109574a9f9a9108626a9beec86f3153511eb504f18e0ac3 Apr 23 13:36:20.268804 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:36:20.268766 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6pz6w" event={"ID":"821df7f9-3f87-4f86-a7e9-82cad302fff0","Type":"ContainerStarted","Data":"e7ddbf60541473210109574a9f9a9108626a9beec86f3153511eb504f18e0ac3"} Apr 23 13:36:21.275424 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:36:21.275390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6pz6w" event={"ID":"821df7f9-3f87-4f86-a7e9-82cad302fff0","Type":"ContainerStarted","Data":"863b792b95a9814c24e23bf6c531574d2d7b72cc9c442b49e4e38eb6c94c7f5e"} Apr 23 13:36:21.275780 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:36:21.275432 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6pz6w" event={"ID":"821df7f9-3f87-4f86-a7e9-82cad302fff0","Type":"ContainerStarted","Data":"2d8be321b1bfebb345add55f328b78767683c5c30c7d4339df5a776e8661c9fb"} Apr 23 13:36:21.292112 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:36:21.292075 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6pz6w" podStartSLOduration=253.34173642 podStartE2EDuration="4m14.29206354s" podCreationTimestamp="2026-04-23 13:32:07 +0000 UTC" firstStartedPulling="2026-04-23 13:36:19.907361719 +0000 UTC m=+252.885270289" lastFinishedPulling="2026-04-23 13:36:20.857688839 +0000 UTC m=+253.835597409" observedRunningTime="2026-04-23 13:36:21.290524878 +0000 UTC m=+254.268433470" watchObservedRunningTime="2026-04-23 13:36:21.29206354 +0000 UTC m=+254.269972132" Apr 23 13:37:07.473257 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:37:07.473231 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 13:39:02.633242 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.633206 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt"] Apr 23 13:39:02.633663 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.633425 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" containerName="registry" Apr 23 13:39:02.633663 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.633436 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" containerName="registry" Apr 23 13:39:02.633663 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.633501 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2de7f8d7-9da6-4fa8-b5a9-c641f96806e0" containerName="registry" Apr 23 13:39:02.636348 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.636325 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" Apr 23 13:39:02.639068 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.639045 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 23 13:39:02.639192 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.639173 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-gvzn5\"" Apr 23 13:39:02.639259 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.639209 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 23 13:39:02.639319 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.639303 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 23 13:39:02.647742 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.647721 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt"] Apr 23 13:39:02.787906 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.787877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/440a422d-60c1-4826-8b70-97d23d707157-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wdggt\" (UID: \"440a422d-60c1-4826-8b70-97d23d707157\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" Apr 23 13:39:02.788050 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.787919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mw26\" (UniqueName: \"kubernetes.io/projected/440a422d-60c1-4826-8b70-97d23d707157-kube-api-access-9mw26\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wdggt\" (UID: \"440a422d-60c1-4826-8b70-97d23d707157\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" Apr 23 13:39:02.888624 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.888567 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/440a422d-60c1-4826-8b70-97d23d707157-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wdggt\" (UID: \"440a422d-60c1-4826-8b70-97d23d707157\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" Apr 23 13:39:02.888624 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.888608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mw26\" (UniqueName: \"kubernetes.io/projected/440a422d-60c1-4826-8b70-97d23d707157-kube-api-access-9mw26\") pod 
\"custom-metrics-autoscaler-operator-bbf89fd5d-wdggt\" (UID: \"440a422d-60c1-4826-8b70-97d23d707157\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" Apr 23 13:39:02.890830 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.890807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/440a422d-60c1-4826-8b70-97d23d707157-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wdggt\" (UID: \"440a422d-60c1-4826-8b70-97d23d707157\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" Apr 23 13:39:02.898952 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.898924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mw26\" (UniqueName: \"kubernetes.io/projected/440a422d-60c1-4826-8b70-97d23d707157-kube-api-access-9mw26\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wdggt\" (UID: \"440a422d-60c1-4826-8b70-97d23d707157\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" Apr 23 13:39:02.945965 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:02.945941 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" Apr 23 13:39:03.057938 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:03.057916 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt"] Apr 23 13:39:03.060448 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:39:03.060411 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod440a422d_60c1_4826_8b70_97d23d707157.slice/crio-bc41d7b2b792d18e4f7fa691e16600ba75480bbb1c0b8bdc427453566459d956 WatchSource:0}: Error finding container bc41d7b2b792d18e4f7fa691e16600ba75480bbb1c0b8bdc427453566459d956: Status 404 returned error can't find the container with id bc41d7b2b792d18e4f7fa691e16600ba75480bbb1c0b8bdc427453566459d956 Apr 23 13:39:03.062064 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:03.062047 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:39:03.670776 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:03.670738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" event={"ID":"440a422d-60c1-4826-8b70-97d23d707157","Type":"ContainerStarted","Data":"bc41d7b2b792d18e4f7fa691e16600ba75480bbb1c0b8bdc427453566459d956"} Apr 23 13:39:07.683610 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:07.683569 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" event={"ID":"440a422d-60c1-4826-8b70-97d23d707157","Type":"ContainerStarted","Data":"e283d277838f6b2be54ed492eb5298e4420b0d1c27c92d1c1b0ce983d3d47bb8"} Apr 23 13:39:07.683977 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:07.683646 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" Apr 23 13:39:07.712092 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:07.712048 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" podStartSLOduration=1.884168914 podStartE2EDuration="5.7120359s" podCreationTimestamp="2026-04-23 13:39:02 
+0000 UTC" firstStartedPulling="2026-04-23 13:39:03.062176682 +0000 UTC m=+416.040085253" lastFinishedPulling="2026-04-23 13:39:06.890043669 +0000 UTC m=+419.867952239" observedRunningTime="2026-04-23 13:39:07.711695512 +0000 UTC m=+420.689604104" watchObservedRunningTime="2026-04-23 13:39:07.7120359 +0000 UTC m=+420.689944492" Apr 23 13:39:28.688534 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:39:28.688437 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wdggt" Apr 23 13:40:14.197719 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.197684 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2shc8"] Apr 23 13:40:14.199575 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.199555 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" Apr 23 13:40:14.202376 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.202356 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-5kggg\"" Apr 23 13:40:14.202376 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.202360 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 13:40:14.202569 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.202408 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 23 13:40:14.203389 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.203375 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 13:40:14.209685 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.209667 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2shc8"] Apr 23 13:40:14.231751 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.231726 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-sr6pk"] Apr 23 13:40:14.233701 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.233686 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-sr6pk" Apr 23 13:40:14.236463 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.236442 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 13:40:14.236579 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.236530 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-brc9m\"" Apr 23 13:40:14.243433 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.243414 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-sr6pk"] Apr 23 13:40:14.272885 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.272862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/909b49ca-874d-4fa9-a88d-1205920e24fb-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2shc8\" (UID: \"909b49ca-874d-4fa9-a88d-1205920e24fb\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" Apr 23 13:40:14.272986 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.272925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd899\" (UniqueName: \"kubernetes.io/projected/909b49ca-874d-4fa9-a88d-1205920e24fb-kube-api-access-nd899\") pod \"llmisvc-controller-manager-68cc5db7c4-2shc8\" (UID: \"909b49ca-874d-4fa9-a88d-1205920e24fb\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" Apr 23 13:40:14.272986 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.272971 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6fc263fb-458d-40a2-a680-52de2c3b4007-data\") pod \"seaweedfs-86cc847c5c-sr6pk\" (UID: \"6fc263fb-458d-40a2-a680-52de2c3b4007\") " pod="kserve/seaweedfs-86cc847c5c-sr6pk" Apr 23 13:40:14.273075 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.272998 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8zm7\" (UniqueName: \"kubernetes.io/projected/6fc263fb-458d-40a2-a680-52de2c3b4007-kube-api-access-b8zm7\") pod \"seaweedfs-86cc847c5c-sr6pk\" (UID: \"6fc263fb-458d-40a2-a680-52de2c3b4007\") " pod="kserve/seaweedfs-86cc847c5c-sr6pk" Apr 23 13:40:14.374259 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.374226 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/909b49ca-874d-4fa9-a88d-1205920e24fb-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2shc8\" (UID: \"909b49ca-874d-4fa9-a88d-1205920e24fb\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" Apr 23 13:40:14.374394 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.374285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nd899\" (UniqueName: \"kubernetes.io/projected/909b49ca-874d-4fa9-a88d-1205920e24fb-kube-api-access-nd899\") pod \"llmisvc-controller-manager-68cc5db7c4-2shc8\" (UID: \"909b49ca-874d-4fa9-a88d-1205920e24fb\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" Apr 23 13:40:14.374394 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.374303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6fc263fb-458d-40a2-a680-52de2c3b4007-data\") pod \"seaweedfs-86cc847c5c-sr6pk\" (UID: 
\"6fc263fb-458d-40a2-a680-52de2c3b4007\") " pod="kserve/seaweedfs-86cc847c5c-sr6pk" Apr 23 13:40:14.374529 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.374415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8zm7\" (UniqueName: \"kubernetes.io/projected/6fc263fb-458d-40a2-a680-52de2c3b4007-kube-api-access-b8zm7\") pod \"seaweedfs-86cc847c5c-sr6pk\" (UID: \"6fc263fb-458d-40a2-a680-52de2c3b4007\") " pod="kserve/seaweedfs-86cc847c5c-sr6pk" Apr 23 13:40:14.374629 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.374613 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6fc263fb-458d-40a2-a680-52de2c3b4007-data\") pod \"seaweedfs-86cc847c5c-sr6pk\" (UID: \"6fc263fb-458d-40a2-a680-52de2c3b4007\") " pod="kserve/seaweedfs-86cc847c5c-sr6pk" Apr 23 13:40:14.376647 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.376624 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/909b49ca-874d-4fa9-a88d-1205920e24fb-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2shc8\" (UID: \"909b49ca-874d-4fa9-a88d-1205920e24fb\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" Apr 23 13:40:14.383013 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.382989 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd899\" (UniqueName: \"kubernetes.io/projected/909b49ca-874d-4fa9-a88d-1205920e24fb-kube-api-access-nd899\") pod \"llmisvc-controller-manager-68cc5db7c4-2shc8\" (UID: \"909b49ca-874d-4fa9-a88d-1205920e24fb\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" Apr 23 13:40:14.384028 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.384005 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8zm7\" (UniqueName: \"kubernetes.io/projected/6fc263fb-458d-40a2-a680-52de2c3b4007-kube-api-access-b8zm7\") pod \"seaweedfs-86cc847c5c-sr6pk\" (UID: \"6fc263fb-458d-40a2-a680-52de2c3b4007\") " pod="kserve/seaweedfs-86cc847c5c-sr6pk" Apr 23 13:40:14.508959 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.508891 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" Apr 23 13:40:14.542128 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.542103 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-sr6pk" Apr 23 13:40:14.645189 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.645151 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2shc8"] Apr 23 13:40:14.647629 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:40:14.647597 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod909b49ca_874d_4fa9_a88d_1205920e24fb.slice/crio-322791979b5a6f1495783a57046d4acae60fed61f7c04c99e05385e0736d053f WatchSource:0}: Error finding container 322791979b5a6f1495783a57046d4acae60fed61f7c04c99e05385e0736d053f: Status 404 returned error can't find the container with id 322791979b5a6f1495783a57046d4acae60fed61f7c04c99e05385e0736d053f Apr 23 13:40:14.674837 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.674815 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-sr6pk"] Apr 23 13:40:14.677574 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:40:14.677548 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fc263fb_458d_40a2_a680_52de2c3b4007.slice/crio-614f2f5ea327ea36e9cc6ced7bdbbcd5f51927e1f501dfcdade074784860fb9a WatchSource:0}: Error finding container 614f2f5ea327ea36e9cc6ced7bdbbcd5f51927e1f501dfcdade074784860fb9a: Status 404 returned error can't find the container with id 614f2f5ea327ea36e9cc6ced7bdbbcd5f51927e1f501dfcdade074784860fb9a Apr 23 13:40:14.853678 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.853603 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-sr6pk" event={"ID":"6fc263fb-458d-40a2-a680-52de2c3b4007","Type":"ContainerStarted","Data":"614f2f5ea327ea36e9cc6ced7bdbbcd5f51927e1f501dfcdade074784860fb9a"} Apr 23 13:40:14.854528 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:14.854508 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" event={"ID":"909b49ca-874d-4fa9-a88d-1205920e24fb","Type":"ContainerStarted","Data":"322791979b5a6f1495783a57046d4acae60fed61f7c04c99e05385e0736d053f"} Apr 23 13:40:18.869110 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:18.869049 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-sr6pk" event={"ID":"6fc263fb-458d-40a2-a680-52de2c3b4007","Type":"ContainerStarted","Data":"2922120b5ff59e48dfa9d3ad3e0c72d3b61c8d52c6e814c3c0144afeb4136a60"} Apr 23 13:40:18.869592 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:18.869123 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-sr6pk" Apr 23 13:40:18.870344 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:18.870315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" event={"ID":"909b49ca-874d-4fa9-a88d-1205920e24fb","Type":"ContainerStarted","Data":"82d80ebe16c64fc9c62b956285d7d469bd5110cbd327df61dd1a0d8e8e63c633"} Apr 23 13:40:18.870461 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:18.870429 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" Apr 23 13:40:18.883894 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:18.883847 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-sr6pk" podStartSLOduration=1.769877862 podStartE2EDuration="4.883835374s" 
podCreationTimestamp="2026-04-23 13:40:14 +0000 UTC" firstStartedPulling="2026-04-23 13:40:14.678771087 +0000 UTC m=+487.656679657" lastFinishedPulling="2026-04-23 13:40:17.792728599 +0000 UTC m=+490.770637169" observedRunningTime="2026-04-23 13:40:18.883502523 +0000 UTC m=+491.861411105" watchObservedRunningTime="2026-04-23 13:40:18.883835374 +0000 UTC m=+491.861744002" Apr 23 13:40:24.875282 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:24.875250 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-sr6pk" Apr 23 13:40:24.892135 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:24.892091 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" podStartSLOduration=7.799544448 podStartE2EDuration="10.892079231s" podCreationTimestamp="2026-04-23 13:40:14 +0000 UTC" firstStartedPulling="2026-04-23 13:40:14.648948272 +0000 UTC m=+487.626856842" lastFinishedPulling="2026-04-23 13:40:17.741483052 +0000 UTC m=+490.719391625" observedRunningTime="2026-04-23 13:40:18.899268831 +0000 UTC m=+491.877177424" watchObservedRunningTime="2026-04-23 13:40:24.892079231 +0000 UTC m=+497.869987830" Apr 23 13:40:49.875414 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:40:49.875384 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2shc8" Apr 23 13:41:24.752139 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.752110 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-tdtsd"] Apr 23 13:41:24.753933 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.753917 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tdtsd" Apr 23 13:41:24.756451 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.756429 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 23 13:41:24.756584 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.756469 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-8lf7c\"" Apr 23 13:41:24.761895 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.761873 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tdtsd"] Apr 23 13:41:24.768800 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.768777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5cdd1980-12e8-4103-8738-ba37e5119ae4-tls-certs\") pod \"model-serving-api-86f7b4b499-tdtsd\" (UID: \"5cdd1980-12e8-4103-8738-ba37e5119ae4\") " pod="kserve/model-serving-api-86f7b4b499-tdtsd" Apr 23 13:41:24.768901 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.768815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62qsg\" (UniqueName: \"kubernetes.io/projected/5cdd1980-12e8-4103-8738-ba37e5119ae4-kube-api-access-62qsg\") pod \"model-serving-api-86f7b4b499-tdtsd\" (UID: \"5cdd1980-12e8-4103-8738-ba37e5119ae4\") " pod="kserve/model-serving-api-86f7b4b499-tdtsd" Apr 23 13:41:24.869163 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.869130 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62qsg\" (UniqueName: 
\"kubernetes.io/projected/5cdd1980-12e8-4103-8738-ba37e5119ae4-kube-api-access-62qsg\") pod \"model-serving-api-86f7b4b499-tdtsd\" (UID: \"5cdd1980-12e8-4103-8738-ba37e5119ae4\") " pod="kserve/model-serving-api-86f7b4b499-tdtsd" Apr 23 13:41:24.869329 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.869199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5cdd1980-12e8-4103-8738-ba37e5119ae4-tls-certs\") pod \"model-serving-api-86f7b4b499-tdtsd\" (UID: \"5cdd1980-12e8-4103-8738-ba37e5119ae4\") " pod="kserve/model-serving-api-86f7b4b499-tdtsd" Apr 23 13:41:24.871690 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.871666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5cdd1980-12e8-4103-8738-ba37e5119ae4-tls-certs\") pod \"model-serving-api-86f7b4b499-tdtsd\" (UID: \"5cdd1980-12e8-4103-8738-ba37e5119ae4\") " pod="kserve/model-serving-api-86f7b4b499-tdtsd" Apr 23 13:41:24.876819 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:24.876783 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62qsg\" (UniqueName: \"kubernetes.io/projected/5cdd1980-12e8-4103-8738-ba37e5119ae4-kube-api-access-62qsg\") pod \"model-serving-api-86f7b4b499-tdtsd\" (UID: \"5cdd1980-12e8-4103-8738-ba37e5119ae4\") " pod="kserve/model-serving-api-86f7b4b499-tdtsd" Apr 23 13:41:25.064528 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:25.064419 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tdtsd" Apr 23 13:41:25.189021 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:25.188989 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tdtsd"] Apr 23 13:41:25.192722 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:41:25.192691 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cdd1980_12e8_4103_8738_ba37e5119ae4.slice/crio-11f19a0101700030e975c0d8b1f5ccc804a23f60ba999b8137ab2a336f14242f WatchSource:0}: Error finding container 11f19a0101700030e975c0d8b1f5ccc804a23f60ba999b8137ab2a336f14242f: Status 404 returned error can't find the container with id 11f19a0101700030e975c0d8b1f5ccc804a23f60ba999b8137ab2a336f14242f Apr 23 13:41:26.058033 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:26.057986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tdtsd" event={"ID":"5cdd1980-12e8-4103-8738-ba37e5119ae4","Type":"ContainerStarted","Data":"11f19a0101700030e975c0d8b1f5ccc804a23f60ba999b8137ab2a336f14242f"} Apr 23 13:41:28.064785 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:28.064743 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tdtsd" event={"ID":"5cdd1980-12e8-4103-8738-ba37e5119ae4","Type":"ContainerStarted","Data":"c1ba2082e192729a0a8f90534a7dd3b66f70c68819655e2ec263650332b8c7b8"} Apr 23 13:41:28.065146 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:28.064880 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-tdtsd" Apr 23 13:41:28.081115 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:28.081071 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-tdtsd" podStartSLOduration=1.7772998549999999 
podStartE2EDuration="4.081059852s" podCreationTimestamp="2026-04-23 13:41:24 +0000 UTC" firstStartedPulling="2026-04-23 13:41:25.19493788 +0000 UTC m=+558.172846457" lastFinishedPulling="2026-04-23 13:41:27.49869787 +0000 UTC m=+560.476606454" observedRunningTime="2026-04-23 13:41:28.080244046 +0000 UTC m=+561.058152641" watchObservedRunningTime="2026-04-23 13:41:28.081059852 +0000 UTC m=+561.058968451" Apr 23 13:41:39.084669 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:41:39.084634 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-tdtsd" Apr 23 13:42:00.288197 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.288157 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8"] Apr 23 13:42:00.290549 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.290532 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.292952 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.292925 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\"" Apr 23 13:42:00.293131 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.293117 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:42:00.293187 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.293131 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tk4rw\"" Apr 23 13:42:00.294239 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.294221 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:42:00.294330 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.294277 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\"" Apr 23 13:42:00.299119 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.299098 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8"] Apr 23 13:42:00.403682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.403654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qsgg\" (UniqueName: \"kubernetes.io/projected/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kube-api-access-7qsgg\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.403837 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.403693 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.403837 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.403793 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.403934 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.403840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.504665 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.504633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.504799 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.504676 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.504799 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.504703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qsgg\" (UniqueName: \"kubernetes.io/projected/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kube-api-access-7qsgg\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.504799 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.504741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.504799 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:42:00.504785 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-serving-cert: secret "isvc-sklearn-graph-1-predictor-serving-cert" not found Apr 23 13:42:00.504941 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:42:00.504849 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls podName:01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e nodeName:}" failed. No retries permitted until 2026-04-23 13:42:01.004831917 +0000 UTC m=+593.982740499 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls") pod "isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" (UID: "01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e") : secret "isvc-sklearn-graph-1-predictor-serving-cert" not found Apr 23 13:42:00.505104 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.505083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.505377 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.505330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.514788 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.514765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qsgg\" (UniqueName: \"kubernetes.io/projected/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kube-api-access-7qsgg\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:00.547306 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.547259 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g"] Apr 23 13:42:00.549263 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.549249 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:00.551558 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.551537 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-94d25-kube-rbac-proxy-sar-config\"" Apr 23 13:42:00.551727 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.551705 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-94d25-predictor-serving-cert\"" Apr 23 13:42:00.559088 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.559065 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g"] Apr 23 13:42:00.605203 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.605181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-94d25-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/555b7e6f-53f2-4b5d-9b94-1286de763e29-error-404-isvc-94d25-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-94d25-predictor-f56cbdc6b-6r59g\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:00.605334 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.605223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b7e6f-53f2-4b5d-9b94-1286de763e29-proxy-tls\") pod \"error-404-isvc-94d25-predictor-f56cbdc6b-6r59g\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:00.605334 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.605241 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxjg5\" (UniqueName: \"kubernetes.io/projected/555b7e6f-53f2-4b5d-9b94-1286de763e29-kube-api-access-cxjg5\") pod \"error-404-isvc-94d25-predictor-f56cbdc6b-6r59g\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:00.706060 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.706023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b7e6f-53f2-4b5d-9b94-1286de763e29-proxy-tls\") pod \"error-404-isvc-94d25-predictor-f56cbdc6b-6r59g\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:00.706175 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.706064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxjg5\" (UniqueName: \"kubernetes.io/projected/555b7e6f-53f2-4b5d-9b94-1286de763e29-kube-api-access-cxjg5\") pod \"error-404-isvc-94d25-predictor-f56cbdc6b-6r59g\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:00.706175 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:42:00.706162 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-94d25-predictor-serving-cert: secret "error-404-isvc-94d25-predictor-serving-cert" not found Apr 23 13:42:00.706251 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.706211 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"error-404-isvc-94d25-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/555b7e6f-53f2-4b5d-9b94-1286de763e29-error-404-isvc-94d25-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-94d25-predictor-f56cbdc6b-6r59g\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:00.706251 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:42:00.706248 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/555b7e6f-53f2-4b5d-9b94-1286de763e29-proxy-tls podName:555b7e6f-53f2-4b5d-9b94-1286de763e29 nodeName:}" failed. No retries permitted until 2026-04-23 13:42:01.206234072 +0000 UTC m=+594.184142643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/555b7e6f-53f2-4b5d-9b94-1286de763e29-proxy-tls") pod "error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" (UID: "555b7e6f-53f2-4b5d-9b94-1286de763e29") : secret "error-404-isvc-94d25-predictor-serving-cert" not found Apr 23 13:42:00.706776 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.706757 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-94d25-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/555b7e6f-53f2-4b5d-9b94-1286de763e29-error-404-isvc-94d25-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-94d25-predictor-f56cbdc6b-6r59g\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:00.714785 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:00.714764 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxjg5\" (UniqueName: \"kubernetes.io/projected/555b7e6f-53f2-4b5d-9b94-1286de763e29-kube-api-access-cxjg5\") pod \"error-404-isvc-94d25-predictor-f56cbdc6b-6r59g\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:01.008141 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:01.008108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:01.008296 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:42:01.008266 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-serving-cert: secret "isvc-sklearn-graph-1-predictor-serving-cert" not found Apr 23 13:42:01.008337 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:42:01.008323 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls podName:01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e nodeName:}" failed. No retries permitted until 2026-04-23 13:42:02.008309051 +0000 UTC m=+594.986217626 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls") pod "isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" (UID: "01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e") : secret "isvc-sklearn-graph-1-predictor-serving-cert" not found Apr 23 13:42:01.209775 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:01.209744 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b7e6f-53f2-4b5d-9b94-1286de763e29-proxy-tls\") pod \"error-404-isvc-94d25-predictor-f56cbdc6b-6r59g\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:01.212120 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:01.212096 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b7e6f-53f2-4b5d-9b94-1286de763e29-proxy-tls\") pod \"error-404-isvc-94d25-predictor-f56cbdc6b-6r59g\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:01.460001 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:01.459972 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:01.575759 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:01.575732 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g"] Apr 23 13:42:01.579043 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:42:01.579013 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod555b7e6f_53f2_4b5d_9b94_1286de763e29.slice/crio-87cc4e7d87946430580ede323a7562a18f4d54162f4a164ab43cb3e231efff09 WatchSource:0}: Error finding container 87cc4e7d87946430580ede323a7562a18f4d54162f4a164ab43cb3e231efff09: Status 404 returned error can't find the container with id 87cc4e7d87946430580ede323a7562a18f4d54162f4a164ab43cb3e231efff09 Apr 23 13:42:02.016808 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:02.016764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:02.019098 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:02.019080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:02.100725 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:02.100699 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:02.177193 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:02.176454 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" event={"ID":"555b7e6f-53f2-4b5d-9b94-1286de763e29","Type":"ContainerStarted","Data":"87cc4e7d87946430580ede323a7562a18f4d54162f4a164ab43cb3e231efff09"} Apr 23 13:42:02.256225 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:02.256181 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8"] Apr 23 13:42:03.182702 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:03.182638 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" event={"ID":"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e","Type":"ContainerStarted","Data":"3619ed39c29f795413a784e04a2733d6671effa4a3c093e63c6339515da6e3d9"} Apr 23 13:42:15.225837 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:15.225787 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" event={"ID":"555b7e6f-53f2-4b5d-9b94-1286de763e29","Type":"ContainerStarted","Data":"d2986728fbfe6ed0ef2813d2524c11ecfe47588b90f7b35e1cbecf47b5324c61"} Apr 23 13:42:15.228262 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:15.228130 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" event={"ID":"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e","Type":"ContainerStarted","Data":"851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821"} Apr 23 13:42:17.235484 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:17.235450 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" event={"ID":"555b7e6f-53f2-4b5d-9b94-1286de763e29","Type":"ContainerStarted","Data":"c214315856cc6d2c2e7f4009e25cd527ed585264e4d2b4b76966b7d68677fdce"} Apr 23 13:42:17.235906 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:17.235611 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:17.253177 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:17.253129 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" podStartSLOduration=2.179562859 podStartE2EDuration="17.253116753s" podCreationTimestamp="2026-04-23 13:42:00 +0000 UTC" firstStartedPulling="2026-04-23 13:42:01.580901781 +0000 UTC m=+594.558810351" lastFinishedPulling="2026-04-23 13:42:16.654455676 +0000 UTC m=+609.632364245" observedRunningTime="2026-04-23 13:42:17.251419651 +0000 UTC m=+610.229328243" watchObservedRunningTime="2026-04-23 13:42:17.253116753 +0000 UTC m=+610.231025392" Apr 23 13:42:18.240291 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:18.240259 2577 generic.go:358] "Generic (PLEG): container finished" podID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerID="851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821" exitCode=0 Apr 23 13:42:18.240667 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:18.240334 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" 
event={"ID":"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e","Type":"ContainerDied","Data":"851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821"} Apr 23 13:42:18.240712 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:18.240677 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:18.242018 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:18.241990 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 13:42:19.243481 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:19.243434 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 13:42:24.248684 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:24.248649 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:42:24.249255 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:24.249229 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 13:42:25.262023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:25.261989 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" event={"ID":"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e","Type":"ContainerStarted","Data":"f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004"} Apr 23 13:42:25.262023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:25.262028 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" event={"ID":"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e","Type":"ContainerStarted","Data":"95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f"} Apr 23 13:42:25.262423 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:25.262222 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:25.280639 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:25.280599 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podStartSLOduration=3.135475332 podStartE2EDuration="25.280588199s" podCreationTimestamp="2026-04-23 13:42:00 +0000 UTC" firstStartedPulling="2026-04-23 13:42:02.264464941 +0000 UTC m=+595.242373513" lastFinishedPulling="2026-04-23 13:42:24.40957781 +0000 UTC m=+617.387486380" observedRunningTime="2026-04-23 13:42:25.279975616 +0000 UTC m=+618.257884219" watchObservedRunningTime="2026-04-23 13:42:25.280588199 +0000 UTC m=+618.258496791" Apr 23 13:42:26.265517 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:26.265424 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:26.266702 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:26.266666 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 13:42:27.268004 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:27.267957 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 13:42:32.272406 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:32.272371 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:42:32.273043 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:32.273013 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 13:42:34.249860 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:34.249706 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 13:42:42.273573 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:42.273530 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 13:42:44.249432 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:44.249390 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 13:42:52.273237 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:52.273197 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 13:42:54.249318 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:42:54.249280 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 23 13:43:02.273558 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:02.273518 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" 
podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 13:43:04.250318 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:04.250290 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:43:12.273452 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:12.273415 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 13:43:20.142882 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.142852 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx"] Apr 23 13:43:20.144837 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.144821 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:20.147368 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.147345 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-94d25-kube-rbac-proxy-sar-config\"" Apr 23 13:43:20.147522 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.147351 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-94d25-serving-cert\"" Apr 23 13:43:20.152249 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.152229 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx"] Apr 23 13:43:20.218327 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.218302 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-openshift-service-ca-bundle\") pod \"switch-graph-94d25-5664f9bbb4-v4vxx\" (UID: \"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\") " pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:20.218457 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.218341 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-proxy-tls\") pod \"switch-graph-94d25-5664f9bbb4-v4vxx\" (UID: \"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\") " pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:20.319674 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.319643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-openshift-service-ca-bundle\") pod \"switch-graph-94d25-5664f9bbb4-v4vxx\" (UID: \"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\") " pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:20.319773 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.319687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-proxy-tls\") pod \"switch-graph-94d25-5664f9bbb4-v4vxx\" (UID: 
\"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\") " pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:20.319815 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:43:20.319777 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-94d25-serving-cert: secret "switch-graph-94d25-serving-cert" not found Apr 23 13:43:20.319848 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:43:20.319834 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-proxy-tls podName:0f72de3a-e751-4e2d-b589-a4cac2dd1dd1 nodeName:}" failed. No retries permitted until 2026-04-23 13:43:20.819819658 +0000 UTC m=+673.797728228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-proxy-tls") pod "switch-graph-94d25-5664f9bbb4-v4vxx" (UID: "0f72de3a-e751-4e2d-b589-a4cac2dd1dd1") : secret "switch-graph-94d25-serving-cert" not found Apr 23 13:43:20.320227 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.320207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-openshift-service-ca-bundle\") pod \"switch-graph-94d25-5664f9bbb4-v4vxx\" (UID: \"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\") " pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:20.823856 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.823823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-proxy-tls\") pod \"switch-graph-94d25-5664f9bbb4-v4vxx\" (UID: \"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\") " pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:20.826064 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:20.826030 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-proxy-tls\") pod \"switch-graph-94d25-5664f9bbb4-v4vxx\" (UID: \"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\") " pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:21.055193 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:21.055149 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:21.170610 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:21.170577 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx"] Apr 23 13:43:21.174135 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:43:21.174110 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f72de3a_e751_4e2d_b589_a4cac2dd1dd1.slice/crio-a0b43c5589b66d72e6ab6ce7a867a166312cefbad8e30f5c854cd2eddf3c853d WatchSource:0}: Error finding container a0b43c5589b66d72e6ab6ce7a867a166312cefbad8e30f5c854cd2eddf3c853d: Status 404 returned error can't find the container with id a0b43c5589b66d72e6ab6ce7a867a166312cefbad8e30f5c854cd2eddf3c853d Apr 23 13:43:21.417454 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:21.417370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" event={"ID":"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1","Type":"ContainerStarted","Data":"a0b43c5589b66d72e6ab6ce7a867a166312cefbad8e30f5c854cd2eddf3c853d"} Apr 23 13:43:22.273172 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:22.273137 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 13:43:24.427479 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:24.427438 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" event={"ID":"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1","Type":"ContainerStarted","Data":"9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526"} Apr 23 13:43:24.427857 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:24.427531 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:24.443953 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:24.443910 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" podStartSLOduration=1.8626330260000001 podStartE2EDuration="4.443897447s" podCreationTimestamp="2026-04-23 13:43:20 +0000 UTC" firstStartedPulling="2026-04-23 13:43:21.175951187 +0000 UTC m=+674.153859764" lastFinishedPulling="2026-04-23 13:43:23.757215607 +0000 UTC m=+676.735124185" observedRunningTime="2026-04-23 13:43:24.442704031 +0000 UTC m=+677.420612625" watchObservedRunningTime="2026-04-23 13:43:24.443897447 +0000 UTC m=+677.421806039" Apr 23 13:43:30.436168 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:30.436138 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:32.274376 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:32.274348 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:43:34.341089 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.341057 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx"] Apr 23 13:43:34.341467 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.341288 2577 
Apr 23 13:43:34.562347 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.562316 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g"]
Apr 23 13:43:34.562654 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.562615 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kserve-container" containerID="cri-o://d2986728fbfe6ed0ef2813d2524c11ecfe47588b90f7b35e1cbecf47b5324c61" gracePeriod=30
Apr 23 13:43:34.562747 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.562641 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kube-rbac-proxy" containerID="cri-o://c214315856cc6d2c2e7f4009e25cd527ed585264e4d2b4b76966b7d68677fdce" gracePeriod=30
Apr 23 13:43:34.619456 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.619397 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"]
Apr 23 13:43:34.621457 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.621442 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:34.624151 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.624133 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d2d70-predictor-serving-cert\""
Apr 23 13:43:34.624257 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.624158 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d2d70-kube-rbac-proxy-sar-config\""
Apr 23 13:43:34.629804 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.629785 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"]
Apr 23 13:43:34.722730 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.722698 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a96fbf6d-90ba-4223-868a-f95ba2c8d876-proxy-tls\") pod \"error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:34.722829 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.722738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-d2d70-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a96fbf6d-90ba-4223-868a-f95ba2c8d876-error-404-isvc-d2d70-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:34.722888 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.722828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8khlw\" (UniqueName: \"kubernetes.io/projected/a96fbf6d-90ba-4223-868a-f95ba2c8d876-kube-api-access-8khlw\") pod \"error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:34.823455 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.823420 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8khlw\" (UniqueName: \"kubernetes.io/projected/a96fbf6d-90ba-4223-868a-f95ba2c8d876-kube-api-access-8khlw\") pod \"error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:34.823618 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.823483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a96fbf6d-90ba-4223-868a-f95ba2c8d876-proxy-tls\") pod \"error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:34.823618 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.823544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-d2d70-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a96fbf6d-90ba-4223-868a-f95ba2c8d876-error-404-isvc-d2d70-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:34.823712 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:43:34.823645 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-serving-cert: secret "error-404-isvc-d2d70-predictor-serving-cert" not found
Apr 23 13:43:34.823712 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:43:34.823699 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a96fbf6d-90ba-4223-868a-f95ba2c8d876-proxy-tls podName:a96fbf6d-90ba-4223-868a-f95ba2c8d876 nodeName:}" failed. No retries permitted until 2026-04-23 13:43:35.323684054 +0000 UTC m=+688.301592625 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a96fbf6d-90ba-4223-868a-f95ba2c8d876-proxy-tls") pod "error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" (UID: "a96fbf6d-90ba-4223-868a-f95ba2c8d876") : secret "error-404-isvc-d2d70-predictor-serving-cert" not found
Apr 23 13:43:34.824183 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.824165 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-d2d70-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a96fbf6d-90ba-4223-868a-f95ba2c8d876-error-404-isvc-d2d70-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:34.834267 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:34.834243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8khlw\" (UniqueName: \"kubernetes.io/projected/a96fbf6d-90ba-4223-868a-f95ba2c8d876-kube-api-access-8khlw\") pod \"error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:35.326948 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:35.326911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a96fbf6d-90ba-4223-868a-f95ba2c8d876-proxy-tls\") pod \"error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:35.329226 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:35.329207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a96fbf6d-90ba-4223-868a-f95ba2c8d876-proxy-tls\") pod \"error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Apr 23 13:43:35.434525 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:35.434474 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" podUID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" containerName="switch-graph-94d25" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:43:35.457508 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:35.457462 2577 generic.go:358] "Generic (PLEG): container finished" podID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerID="c214315856cc6d2c2e7f4009e25cd527ed585264e4d2b4b76966b7d68677fdce" exitCode=2
Apr 23 13:43:35.457636 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:35.457535 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" event={"ID":"555b7e6f-53f2-4b5d-9b94-1286de763e29","Type":"ContainerDied","Data":"c214315856cc6d2c2e7f4009e25cd527ed585264e4d2b4b76966b7d68677fdce"}
Apr 23 13:43:35.530900 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:35.530871 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" Apr 23 13:43:35.651763 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:35.651744 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"] Apr 23 13:43:35.654363 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:43:35.654338 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda96fbf6d_90ba_4223_868a_f95ba2c8d876.slice/crio-7862ada0c54f0f641322559d6692711b2effa7bac123ba19005b73064d823020 WatchSource:0}: Error finding container 7862ada0c54f0f641322559d6692711b2effa7bac123ba19005b73064d823020: Status 404 returned error can't find the container with id 7862ada0c54f0f641322559d6692711b2effa7bac123ba19005b73064d823020 Apr 23 13:43:36.463691 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:36.463660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" event={"ID":"a96fbf6d-90ba-4223-868a-f95ba2c8d876","Type":"ContainerStarted","Data":"1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d"} Apr 23 13:43:36.464077 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:36.463697 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" event={"ID":"a96fbf6d-90ba-4223-868a-f95ba2c8d876","Type":"ContainerStarted","Data":"ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d"} Apr 23 13:43:36.464077 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:36.463711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" event={"ID":"a96fbf6d-90ba-4223-868a-f95ba2c8d876","Type":"ContainerStarted","Data":"7862ada0c54f0f641322559d6692711b2effa7bac123ba19005b73064d823020"} Apr 23 13:43:36.464077 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:36.463773 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" Apr 23 13:43:36.482222 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:36.482174 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" podStartSLOduration=2.482161772 podStartE2EDuration="2.482161772s" podCreationTimestamp="2026-04-23 13:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:43:36.48079743 +0000 UTC m=+689.458706021" watchObservedRunningTime="2026-04-23 13:43:36.482161772 +0000 UTC m=+689.460070342" Apr 23 13:43:37.468630 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.468601 2577 generic.go:358] "Generic (PLEG): container finished" podID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerID="d2986728fbfe6ed0ef2813d2524c11ecfe47588b90f7b35e1cbecf47b5324c61" exitCode=0 Apr 23 13:43:37.468969 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.468684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" event={"ID":"555b7e6f-53f2-4b5d-9b94-1286de763e29","Type":"ContainerDied","Data":"d2986728fbfe6ed0ef2813d2524c11ecfe47588b90f7b35e1cbecf47b5324c61"} Apr 23 13:43:37.468969 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.468935 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" Apr 23 13:43:37.469985 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.469964 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 13:43:37.501387 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.501370 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:43:37.642337 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.642254 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxjg5\" (UniqueName: \"kubernetes.io/projected/555b7e6f-53f2-4b5d-9b94-1286de763e29-kube-api-access-cxjg5\") pod \"555b7e6f-53f2-4b5d-9b94-1286de763e29\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " Apr 23 13:43:37.642337 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.642288 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b7e6f-53f2-4b5d-9b94-1286de763e29-proxy-tls\") pod \"555b7e6f-53f2-4b5d-9b94-1286de763e29\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " Apr 23 13:43:37.642337 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.642313 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-94d25-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/555b7e6f-53f2-4b5d-9b94-1286de763e29-error-404-isvc-94d25-kube-rbac-proxy-sar-config\") pod \"555b7e6f-53f2-4b5d-9b94-1286de763e29\" (UID: \"555b7e6f-53f2-4b5d-9b94-1286de763e29\") " Apr 23 13:43:37.642757 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.642730 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555b7e6f-53f2-4b5d-9b94-1286de763e29-error-404-isvc-94d25-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-94d25-kube-rbac-proxy-sar-config") pod "555b7e6f-53f2-4b5d-9b94-1286de763e29" (UID: "555b7e6f-53f2-4b5d-9b94-1286de763e29"). InnerVolumeSpecName "error-404-isvc-94d25-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:43:37.644803 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.644776 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555b7e6f-53f2-4b5d-9b94-1286de763e29-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "555b7e6f-53f2-4b5d-9b94-1286de763e29" (UID: "555b7e6f-53f2-4b5d-9b94-1286de763e29"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:43:37.644911 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.644793 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555b7e6f-53f2-4b5d-9b94-1286de763e29-kube-api-access-cxjg5" (OuterVolumeSpecName: "kube-api-access-cxjg5") pod "555b7e6f-53f2-4b5d-9b94-1286de763e29" (UID: "555b7e6f-53f2-4b5d-9b94-1286de763e29"). InnerVolumeSpecName "kube-api-access-cxjg5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:43:37.743140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.743104 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cxjg5\" (UniqueName: \"kubernetes.io/projected/555b7e6f-53f2-4b5d-9b94-1286de763e29-kube-api-access-cxjg5\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:43:37.743140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.743135 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/555b7e6f-53f2-4b5d-9b94-1286de763e29-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:43:37.743140 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:37.743146 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-94d25-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/555b7e6f-53f2-4b5d-9b94-1286de763e29-error-404-isvc-94d25-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:43:38.472734 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:38.472688 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" event={"ID":"555b7e6f-53f2-4b5d-9b94-1286de763e29","Type":"ContainerDied","Data":"87cc4e7d87946430580ede323a7562a18f4d54162f4a164ab43cb3e231efff09"} Apr 23 13:43:38.472734 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:38.472743 2577 scope.go:117] "RemoveContainer" containerID="c214315856cc6d2c2e7f4009e25cd527ed585264e4d2b4b76966b7d68677fdce" Apr 23 13:43:38.473231 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:38.472709 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g" Apr 23 13:43:38.473231 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:38.473046 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 13:43:38.480803 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:38.480782 2577 scope.go:117] "RemoveContainer" containerID="d2986728fbfe6ed0ef2813d2524c11ecfe47588b90f7b35e1cbecf47b5324c61" Apr 23 13:43:38.517732 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:38.517709 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g"] Apr 23 13:43:38.531933 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:38.531913 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-94d25-predictor-f56cbdc6b-6r59g"] Apr 23 13:43:39.583526 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:39.583476 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" path="/var/lib/kubelet/pods/555b7e6f-53f2-4b5d-9b94-1286de763e29/volumes" Apr 23 13:43:40.434299 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:40.434262 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" podUID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" containerName="switch-graph-94d25" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:43:43.478251 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:43.478220 2577 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" Apr 23 13:43:43.478799 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:43.478773 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 13:43:45.434228 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:45.434182 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" podUID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" containerName="switch-graph-94d25" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:43:45.434683 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:45.434308 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:43:50.434364 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:50.434316 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" podUID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" containerName="switch-graph-94d25" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:43:53.479654 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:53.479613 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 13:43:55.434287 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:43:55.434240 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" podUID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" containerName="switch-graph-94d25" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:00.126249 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.126174 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf"] Apr 23 13:44:00.126597 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.126439 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kube-rbac-proxy" Apr 23 13:44:00.126597 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.126449 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kube-rbac-proxy" Apr 23 13:44:00.126597 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.126469 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kserve-container" Apr 23 13:44:00.126597 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.126475 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kserve-container" Apr 23 13:44:00.126597 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.126539 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kserve-container" Apr 23 13:44:00.126597 ip-10-0-132-207 kubenswrapper[2577]: I0423 
13:44:00.126548 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="555b7e6f-53f2-4b5d-9b94-1286de763e29" containerName="kube-rbac-proxy" Apr 23 13:44:00.129238 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.129223 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:00.132328 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.132307 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 23 13:44:00.132558 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.132529 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 23 13:44:00.147783 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.147759 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf"] Apr 23 13:44:00.304124 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.304092 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04bceceb-c66a-4738-bf8f-ea4f233212cc-proxy-tls\") pod \"model-chainer-5649869f87-hrjlf\" (UID: \"04bceceb-c66a-4738-bf8f-ea4f233212cc\") " pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:00.304284 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.304145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04bceceb-c66a-4738-bf8f-ea4f233212cc-openshift-service-ca-bundle\") pod \"model-chainer-5649869f87-hrjlf\" (UID: \"04bceceb-c66a-4738-bf8f-ea4f233212cc\") " pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:00.405224 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.405153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04bceceb-c66a-4738-bf8f-ea4f233212cc-openshift-service-ca-bundle\") pod \"model-chainer-5649869f87-hrjlf\" (UID: \"04bceceb-c66a-4738-bf8f-ea4f233212cc\") " pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:00.405339 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.405236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04bceceb-c66a-4738-bf8f-ea4f233212cc-proxy-tls\") pod \"model-chainer-5649869f87-hrjlf\" (UID: \"04bceceb-c66a-4738-bf8f-ea4f233212cc\") " pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:00.405885 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.405866 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04bceceb-c66a-4738-bf8f-ea4f233212cc-openshift-service-ca-bundle\") pod \"model-chainer-5649869f87-hrjlf\" (UID: \"04bceceb-c66a-4738-bf8f-ea4f233212cc\") " pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:00.407689 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.407671 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04bceceb-c66a-4738-bf8f-ea4f233212cc-proxy-tls\") pod \"model-chainer-5649869f87-hrjlf\" (UID: \"04bceceb-c66a-4738-bf8f-ea4f233212cc\") " 
pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:00.434249 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.434221 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" podUID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" containerName="switch-graph-94d25" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:00.438468 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.438452 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:00.556102 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:00.556081 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf"] Apr 23 13:44:00.558584 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:44:00.558550 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04bceceb_c66a_4738_bf8f_ea4f233212cc.slice/crio-4047e7660993966de3529b2aa85ed0926539cae8de1189b220729cb82230eb26 WatchSource:0}: Error finding container 4047e7660993966de3529b2aa85ed0926539cae8de1189b220729cb82230eb26: Status 404 returned error can't find the container with id 4047e7660993966de3529b2aa85ed0926539cae8de1189b220729cb82230eb26 Apr 23 13:44:01.534926 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:01.534888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" event={"ID":"04bceceb-c66a-4738-bf8f-ea4f233212cc","Type":"ContainerStarted","Data":"6f75d23c64fffa40a9e7383499455a672f185863bb938130ea6b1f8e6bee59d6"} Apr 23 13:44:01.534926 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:01.534929 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" event={"ID":"04bceceb-c66a-4738-bf8f-ea4f233212cc","Type":"ContainerStarted","Data":"4047e7660993966de3529b2aa85ed0926539cae8de1189b220729cb82230eb26"} Apr 23 13:44:01.535337 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:01.535111 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:01.551869 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:01.551822 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" podStartSLOduration=1.5518118950000002 podStartE2EDuration="1.551811895s" podCreationTimestamp="2026-04-23 13:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:44:01.5494672 +0000 UTC m=+714.527375792" watchObservedRunningTime="2026-04-23 13:44:01.551811895 +0000 UTC m=+714.529720486" Apr 23 13:44:03.479164 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:03.479129 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 13:44:04.474583 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.474559 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:44:04.545107 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.545068 2577 generic.go:358] "Generic (PLEG): container finished" podID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" containerID="9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526" exitCode=0 Apr 23 13:44:04.545449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.545127 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" Apr 23 13:44:04.545449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.545136 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" event={"ID":"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1","Type":"ContainerDied","Data":"9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526"} Apr 23 13:44:04.545449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.545170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx" event={"ID":"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1","Type":"ContainerDied","Data":"a0b43c5589b66d72e6ab6ce7a867a166312cefbad8e30f5c854cd2eddf3c853d"} Apr 23 13:44:04.545449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.545189 2577 scope.go:117] "RemoveContainer" containerID="9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526" Apr 23 13:44:04.552322 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.552303 2577 scope.go:117] "RemoveContainer" containerID="9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526" Apr 23 13:44:04.552594 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:44:04.552573 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526\": container with ID starting with 9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526 not found: ID does not exist" containerID="9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526" Apr 23 13:44:04.552669 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.552602 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526"} err="failed to get container status \"9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526\": rpc error: code = NotFound desc = could not find container \"9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526\": container with ID starting with 9980b7977dbe3a941b42e770f5bf15bcf87f4f9a4cf9b7e00c9afac109b94526 not found: ID does not exist" Apr 23 13:44:04.640110 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.640040 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-openshift-service-ca-bundle\") pod \"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\" (UID: \"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\") " Apr 23 13:44:04.640110 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.640084 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-proxy-tls\") pod \"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\" (UID: \"0f72de3a-e751-4e2d-b589-a4cac2dd1dd1\") " Apr 23 13:44:04.640478 
ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.640451 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" (UID: "0f72de3a-e751-4e2d-b589-a4cac2dd1dd1"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:44:04.642111 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.642090 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" (UID: "0f72de3a-e751-4e2d-b589-a4cac2dd1dd1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:44:04.740592 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.740561 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:44:04.740592 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.740588 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:44:04.865440 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.865382 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx"] Apr 23 13:44:04.867644 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:04.867623 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-94d25-5664f9bbb4-v4vxx"] Apr 23 13:44:05.582766 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:05.582734 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" path="/var/lib/kubelet/pods/0f72de3a-e751-4e2d-b589-a4cac2dd1dd1/volumes" Apr 23 13:44:07.544084 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:07.544058 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:10.228703 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.228664 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf"] Apr 23 13:44:10.229139 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.228956 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerName="model-chainer" containerID="cri-o://6f75d23c64fffa40a9e7383499455a672f185863bb938130ea6b1f8e6bee59d6" gracePeriod=30 Apr 23 13:44:10.369867 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.369837 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8"] Apr 23 13:44:10.370174 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.370152 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" 
containerID="cri-o://95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f" gracePeriod=30 Apr 23 13:44:10.370255 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.370175 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kube-rbac-proxy" containerID="cri-o://f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004" gracePeriod=30 Apr 23 13:44:10.486467 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.486402 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj"] Apr 23 13:44:10.486718 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.486705 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" containerName="switch-graph-94d25" Apr 23 13:44:10.486762 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.486720 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" containerName="switch-graph-94d25" Apr 23 13:44:10.486799 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.486779 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f72de3a-e751-4e2d-b589-a4cac2dd1dd1" containerName="switch-graph-94d25" Apr 23 13:44:10.489539 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.489521 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.492020 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.491999 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f7bdd-predictor-serving-cert\"" Apr 23 13:44:10.492125 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.492001 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\"" Apr 23 13:44:10.496576 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.496555 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj"] Apr 23 13:44:10.563438 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.563411 2577 generic.go:358] "Generic (PLEG): container finished" podID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerID="f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004" exitCode=2 Apr 23 13:44:10.563571 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.563478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" event={"ID":"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e","Type":"ContainerDied","Data":"f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004"} Apr 23 13:44:10.583070 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.583041 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9586a1e0-fab0-42ee-8108-aca7842dcef2-proxy-tls\") pod \"error-404-isvc-f7bdd-predictor-84546c4756-jjbxj\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.583175 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.583105 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9586a1e0-fab0-42ee-8108-aca7842dcef2-error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f7bdd-predictor-84546c4756-jjbxj\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.583175 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.583142 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlqnk\" (UniqueName: \"kubernetes.io/projected/9586a1e0-fab0-42ee-8108-aca7842dcef2-kube-api-access-wlqnk\") pod \"error-404-isvc-f7bdd-predictor-84546c4756-jjbxj\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.684154 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.684128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9586a1e0-fab0-42ee-8108-aca7842dcef2-proxy-tls\") pod \"error-404-isvc-f7bdd-predictor-84546c4756-jjbxj\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.684273 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.684192 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9586a1e0-fab0-42ee-8108-aca7842dcef2-error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f7bdd-predictor-84546c4756-jjbxj\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.684273 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.684227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlqnk\" (UniqueName: \"kubernetes.io/projected/9586a1e0-fab0-42ee-8108-aca7842dcef2-kube-api-access-wlqnk\") pod \"error-404-isvc-f7bdd-predictor-84546c4756-jjbxj\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.684880 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.684857 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9586a1e0-fab0-42ee-8108-aca7842dcef2-error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f7bdd-predictor-84546c4756-jjbxj\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.686478 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.686458 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9586a1e0-fab0-42ee-8108-aca7842dcef2-proxy-tls\") pod \"error-404-isvc-f7bdd-predictor-84546c4756-jjbxj\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.692477 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.692449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlqnk\" (UniqueName: 
\"kubernetes.io/projected/9586a1e0-fab0-42ee-8108-aca7842dcef2-kube-api-access-wlqnk\") pod \"error-404-isvc-f7bdd-predictor-84546c4756-jjbxj\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.800095 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.800032 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:10.918172 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.918145 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj"] Apr 23 13:44:10.920531 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:44:10.920503 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9586a1e0_fab0_42ee_8108_aca7842dcef2.slice/crio-72b7ada95bfdb582f16abcea9459bb721fc7cf901ae6a0cbb8415069ae312514 WatchSource:0}: Error finding container 72b7ada95bfdb582f16abcea9459bb721fc7cf901ae6a0cbb8415069ae312514: Status 404 returned error can't find the container with id 72b7ada95bfdb582f16abcea9459bb721fc7cf901ae6a0cbb8415069ae312514 Apr 23 13:44:10.922274 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:10.922253 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:44:11.568757 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:11.568717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" event={"ID":"9586a1e0-fab0-42ee-8108-aca7842dcef2","Type":"ContainerStarted","Data":"5dbf385f79331e5ffc2ba2d91a98abbbb745924748705618b15fbe3e319c1d24"} Apr 23 13:44:11.568757 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:11.568760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" event={"ID":"9586a1e0-fab0-42ee-8108-aca7842dcef2","Type":"ContainerStarted","Data":"91a6b0acb99f53048054186a255b01fbbf4eca671b320a4030f6c7b71a463aa5"} Apr 23 13:44:11.569268 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:11.568774 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" event={"ID":"9586a1e0-fab0-42ee-8108-aca7842dcef2","Type":"ContainerStarted","Data":"72b7ada95bfdb582f16abcea9459bb721fc7cf901ae6a0cbb8415069ae312514"} Apr 23 13:44:11.569268 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:11.568863 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:11.584699 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:11.584648 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" podStartSLOduration=1.584632096 podStartE2EDuration="1.584632096s" podCreationTimestamp="2026-04-23 13:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:44:11.583896308 +0000 UTC m=+724.561804902" watchObservedRunningTime="2026-04-23 13:44:11.584632096 +0000 UTC m=+724.562540690" Apr 23 13:44:12.268324 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:12.268281 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.17:8643/healthz\": dial tcp 10.132.0.17:8643: connect: connection refused" Apr 23 13:44:12.273325 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:12.273291 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 23 13:44:12.542841 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:12.542753 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:12.572434 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:12.572402 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:12.574011 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:12.573972 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 13:44:13.479649 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:13.479613 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 23 13:44:13.581190 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:13.581151 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 13:44:14.415443 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.415420 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:44:14.517398 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.517363 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls\") pod \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " Apr 23 13:44:14.517573 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.517405 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kserve-provision-location\") pod \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " Apr 23 13:44:14.517573 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.517434 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " Apr 23 13:44:14.517658 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.517626 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qsgg\" (UniqueName: \"kubernetes.io/projected/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kube-api-access-7qsgg\") pod \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\" (UID: \"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e\") " Apr 23 13:44:14.517773 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.517753 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" (UID: "01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:44:14.517813 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.517794 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" (UID: "01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:44:14.517886 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.517873 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kserve-provision-location\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:44:14.517923 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.517890 2577 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:44:14.519585 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.519561 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" (UID: "01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:44:14.519701 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.519681 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kube-api-access-7qsgg" (OuterVolumeSpecName: "kube-api-access-7qsgg") pod "01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" (UID: "01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e"). InnerVolumeSpecName "kube-api-access-7qsgg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:44:14.585371 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.585339 2577 generic.go:358] "Generic (PLEG): container finished" podID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerID="95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f" exitCode=0 Apr 23 13:44:14.585787 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.585395 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" event={"ID":"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e","Type":"ContainerDied","Data":"95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f"} Apr 23 13:44:14.585787 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.585411 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" Apr 23 13:44:14.585787 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.585425 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8" event={"ID":"01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e","Type":"ContainerDied","Data":"3619ed39c29f795413a784e04a2733d6671effa4a3c093e63c6339515da6e3d9"} Apr 23 13:44:14.585787 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.585443 2577 scope.go:117] "RemoveContainer" containerID="f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004" Apr 23 13:44:14.593122 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.593099 2577 scope.go:117] "RemoveContainer" containerID="95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f" Apr 23 13:44:14.599836 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.599799 2577 scope.go:117] "RemoveContainer" containerID="851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821" Apr 23 13:44:14.606363 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.606345 2577 scope.go:117] "RemoveContainer" containerID="f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004" Apr 23 13:44:14.606716 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:44:14.606689 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004\": container with ID starting with f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004 not found: ID does not exist" containerID="f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004" Apr 23 13:44:14.606789 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.606714 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004"} err="failed to get container status \"f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004\": rpc error: code = NotFound desc = could not find container \"f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004\": container with ID starting with f948017c3b92a4032c0caa28f71555416a3bada60ecf6e9a614b15a39595d004 not found: ID does not exist" Apr 23 13:44:14.606789 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.606735 2577 scope.go:117] "RemoveContainer" containerID="95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f" Apr 23 13:44:14.606984 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:44:14.606962 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f\": container with ID starting with 95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f not found: ID does not exist" containerID="95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f" Apr 23 13:44:14.607023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.606991 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f"} err="failed to get container status \"95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f\": rpc error: code = NotFound desc = could not find container \"95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f\": container with ID starting with 
95d6ca55a2227a68e4b3b17a4cdbdbbca600d4df52e2baf89218b36141fa233f not found: ID does not exist" Apr 23 13:44:14.607023 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.607012 2577 scope.go:117] "RemoveContainer" containerID="851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821" Apr 23 13:44:14.607259 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:44:14.607237 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821\": container with ID starting with 851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821 not found: ID does not exist" containerID="851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821" Apr 23 13:44:14.607312 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.607269 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821"} err="failed to get container status \"851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821\": rpc error: code = NotFound desc = could not find container \"851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821\": container with ID starting with 851cf8c8c28abdd03646537cd343c574c084c2c6d2d649de7c87f7a346763821 not found: ID does not exist" Apr 23 13:44:14.607568 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.607553 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8"] Apr 23 13:44:14.610755 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.610736 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-7c88655ddf-p6px8"] Apr 23 13:44:14.618585 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.618565 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:44:14.618659 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:14.618587 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qsgg\" (UniqueName: \"kubernetes.io/projected/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e-kube-api-access-7qsgg\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:44:15.583879 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:15.583842 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" path="/var/lib/kubelet/pods/01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e/volumes" Apr 23 13:44:17.543419 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:17.543377 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:18.585249 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:18.585223 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:44:18.585674 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:18.585643 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 13:44:22.542298 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:22.542259 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:22.542677 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:22.542365 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:23.479596 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:23.479568 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" Apr 23 13:44:27.542025 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:27.541991 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:28.586180 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:28.586145 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 13:44:32.542047 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:32.542010 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:34.555869 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.555839 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b"] Apr 23 13:44:34.556234 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.556111 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kube-rbac-proxy" Apr 23 13:44:34.556234 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.556122 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kube-rbac-proxy" Apr 23 13:44:34.556234 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.556132 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" Apr 23 13:44:34.556234 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.556138 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" Apr 23 13:44:34.556234 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.556148 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="storage-initializer" Apr 23 13:44:34.556234 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.556154 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="storage-initializer" Apr 23 13:44:34.556234 
ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.556202 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kserve-container" Apr 23 13:44:34.556234 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.556209 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="01b5a2d1-4b2c-4634-a26c-0eb0f0d31d1e" containerName="kube-rbac-proxy" Apr 23 13:44:34.558979 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.558963 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:34.561363 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.561343 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-d2d70-kube-rbac-proxy-sar-config\"" Apr 23 13:44:34.561455 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.561364 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-d2d70-serving-cert\"" Apr 23 13:44:34.569892 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.569865 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b"] Apr 23 13:44:34.655126 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.655095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43db906c-91ff-4eac-84ed-6ee746033d17-openshift-service-ca-bundle\") pod \"switch-graph-d2d70-6d98679489-grd7b\" (UID: \"43db906c-91ff-4eac-84ed-6ee746033d17\") " pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:34.655259 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.655131 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43db906c-91ff-4eac-84ed-6ee746033d17-proxy-tls\") pod \"switch-graph-d2d70-6d98679489-grd7b\" (UID: \"43db906c-91ff-4eac-84ed-6ee746033d17\") " pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:34.756051 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.756023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43db906c-91ff-4eac-84ed-6ee746033d17-openshift-service-ca-bundle\") pod \"switch-graph-d2d70-6d98679489-grd7b\" (UID: \"43db906c-91ff-4eac-84ed-6ee746033d17\") " pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:34.756152 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.756055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43db906c-91ff-4eac-84ed-6ee746033d17-proxy-tls\") pod \"switch-graph-d2d70-6d98679489-grd7b\" (UID: \"43db906c-91ff-4eac-84ed-6ee746033d17\") " pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:34.756195 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:44:34.756169 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-d2d70-serving-cert: secret "switch-graph-d2d70-serving-cert" not found Apr 23 13:44:34.756236 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:44:34.756226 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43db906c-91ff-4eac-84ed-6ee746033d17-proxy-tls 
podName:43db906c-91ff-4eac-84ed-6ee746033d17 nodeName:}" failed. No retries permitted until 2026-04-23 13:44:35.256207953 +0000 UTC m=+748.234116523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/43db906c-91ff-4eac-84ed-6ee746033d17-proxy-tls") pod "switch-graph-d2d70-6d98679489-grd7b" (UID: "43db906c-91ff-4eac-84ed-6ee746033d17") : secret "switch-graph-d2d70-serving-cert" not found Apr 23 13:44:34.756664 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:34.756645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43db906c-91ff-4eac-84ed-6ee746033d17-openshift-service-ca-bundle\") pod \"switch-graph-d2d70-6d98679489-grd7b\" (UID: \"43db906c-91ff-4eac-84ed-6ee746033d17\") " pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:35.258822 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:35.258788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43db906c-91ff-4eac-84ed-6ee746033d17-proxy-tls\") pod \"switch-graph-d2d70-6d98679489-grd7b\" (UID: \"43db906c-91ff-4eac-84ed-6ee746033d17\") " pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:35.261062 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:35.261040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43db906c-91ff-4eac-84ed-6ee746033d17-proxy-tls\") pod \"switch-graph-d2d70-6d98679489-grd7b\" (UID: \"43db906c-91ff-4eac-84ed-6ee746033d17\") " pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:35.469864 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:35.469820 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:35.585319 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:35.585297 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b"] Apr 23 13:44:35.587700 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:44:35.587676 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43db906c_91ff_4eac_84ed_6ee746033d17.slice/crio-b53f90ff87ead83b867543694f79b6f7140c0e1b433b9f42d4acee0342592934 WatchSource:0}: Error finding container b53f90ff87ead83b867543694f79b6f7140c0e1b433b9f42d4acee0342592934: Status 404 returned error can't find the container with id b53f90ff87ead83b867543694f79b6f7140c0e1b433b9f42d4acee0342592934 Apr 23 13:44:35.641549 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:35.641525 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" event={"ID":"43db906c-91ff-4eac-84ed-6ee746033d17","Type":"ContainerStarted","Data":"b53f90ff87ead83b867543694f79b6f7140c0e1b433b9f42d4acee0342592934"} Apr 23 13:44:36.647387 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:36.647351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" event={"ID":"43db906c-91ff-4eac-84ed-6ee746033d17","Type":"ContainerStarted","Data":"a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da"} Apr 23 13:44:36.647781 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:36.647561 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:36.664952 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:36.664911 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" podStartSLOduration=2.6648998969999997 podStartE2EDuration="2.664899897s" podCreationTimestamp="2026-04-23 13:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:44:36.662640508 +0000 UTC m=+749.640549101" watchObservedRunningTime="2026-04-23 13:44:36.664899897 +0000 UTC m=+749.642808489" Apr 23 13:44:37.542604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:37.542560 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:38.585681 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:38.585644 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 13:44:40.659029 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:40.658948 2577 generic.go:358] "Generic (PLEG): container finished" podID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerID="6f75d23c64fffa40a9e7383499455a672f185863bb938130ea6b1f8e6bee59d6" exitCode=0 Apr 23 13:44:40.659362 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:40.659027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" event={"ID":"04bceceb-c66a-4738-bf8f-ea4f233212cc","Type":"ContainerDied","Data":"6f75d23c64fffa40a9e7383499455a672f185863bb938130ea6b1f8e6bee59d6"} Apr 23 13:44:40.859997 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:40.859976 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:40.900603 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:40.900563 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04bceceb-c66a-4738-bf8f-ea4f233212cc-proxy-tls\") pod \"04bceceb-c66a-4738-bf8f-ea4f233212cc\" (UID: \"04bceceb-c66a-4738-bf8f-ea4f233212cc\") " Apr 23 13:44:40.900763 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:40.900675 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04bceceb-c66a-4738-bf8f-ea4f233212cc-openshift-service-ca-bundle\") pod \"04bceceb-c66a-4738-bf8f-ea4f233212cc\" (UID: \"04bceceb-c66a-4738-bf8f-ea4f233212cc\") " Apr 23 13:44:40.901021 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:40.900998 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04bceceb-c66a-4738-bf8f-ea4f233212cc-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "04bceceb-c66a-4738-bf8f-ea4f233212cc" (UID: "04bceceb-c66a-4738-bf8f-ea4f233212cc"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:44:40.902532 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:40.902516 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bceceb-c66a-4738-bf8f-ea4f233212cc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "04bceceb-c66a-4738-bf8f-ea4f233212cc" (UID: "04bceceb-c66a-4738-bf8f-ea4f233212cc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:44:41.001388 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:41.001358 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04bceceb-c66a-4738-bf8f-ea4f233212cc-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:44:41.001388 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:41.001387 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04bceceb-c66a-4738-bf8f-ea4f233212cc-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:44:41.663217 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:41.663186 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" event={"ID":"04bceceb-c66a-4738-bf8f-ea4f233212cc","Type":"ContainerDied","Data":"4047e7660993966de3529b2aa85ed0926539cae8de1189b220729cb82230eb26"} Apr 23 13:44:41.663587 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:41.663222 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf" Apr 23 13:44:41.663587 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:41.663226 2577 scope.go:117] "RemoveContainer" containerID="6f75d23c64fffa40a9e7383499455a672f185863bb938130ea6b1f8e6bee59d6" Apr 23 13:44:41.680676 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:41.680651 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf"] Apr 23 13:44:41.683746 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:41.683721 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5649869f87-hrjlf"] Apr 23 13:44:42.656200 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:42.656167 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:44:43.583598 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:43.583564 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" path="/var/lib/kubelet/pods/04bceceb-c66a-4738-bf8f-ea4f233212cc/volumes" Apr 23 13:44:48.586376 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:48.586336 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 23 13:44:58.586682 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:44:58.586649 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:45:10.391825 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.391791 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk"] Apr 23 13:45:10.392248 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.392064 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerName="model-chainer" Apr 23 13:45:10.392248 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.392074 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerName="model-chainer" Apr 23 13:45:10.392248 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.392142 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="04bceceb-c66a-4738-bf8f-ea4f233212cc" containerName="model-chainer" Apr 23 13:45:10.396285 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.396264 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:45:10.399008 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.398987 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-f7bdd-serving-cert\"" Apr 23 13:45:10.399197 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.398990 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-f7bdd-kube-rbac-proxy-sar-config\"" Apr 23 13:45:10.400799 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.400780 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk"] Apr 23 13:45:10.499822 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.499791 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-proxy-tls\") pod \"sequence-graph-f7bdd-7ccdd96579-trdwk\" (UID: \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\") " pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:45:10.499959 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.499846 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-openshift-service-ca-bundle\") pod \"sequence-graph-f7bdd-7ccdd96579-trdwk\" (UID: \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\") " pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:45:10.600657 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.600631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-openshift-service-ca-bundle\") pod \"sequence-graph-f7bdd-7ccdd96579-trdwk\" (UID: \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\") " pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:45:10.600810 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.600687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-proxy-tls\") pod \"sequence-graph-f7bdd-7ccdd96579-trdwk\" (UID: \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\") " pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:45:10.600810 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:45:10.600771 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-f7bdd-serving-cert: secret "sequence-graph-f7bdd-serving-cert" not found Apr 23 13:45:10.600881 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:45:10.600820 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-proxy-tls podName:ae2dabb8-a691-4df0-bcf8-b03f712dbf61 nodeName:}" failed. No retries permitted until 2026-04-23 13:45:11.100804985 +0000 UTC m=+784.078713554 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-proxy-tls") pod "sequence-graph-f7bdd-7ccdd96579-trdwk" (UID: "ae2dabb8-a691-4df0-bcf8-b03f712dbf61") : secret "sequence-graph-f7bdd-serving-cert" not found Apr 23 13:45:10.601239 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:10.601220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-openshift-service-ca-bundle\") pod \"sequence-graph-f7bdd-7ccdd96579-trdwk\" (UID: \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\") " pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:45:11.104202 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:11.104166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-proxy-tls\") pod \"sequence-graph-f7bdd-7ccdd96579-trdwk\" (UID: \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\") " pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:45:11.106453 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:11.106429 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-proxy-tls\") pod \"sequence-graph-f7bdd-7ccdd96579-trdwk\" (UID: \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\") " pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:45:11.307553 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:11.307520 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:45:11.421343 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:11.421308 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk"] Apr 23 13:45:11.425592 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:45:11.425561 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2dabb8_a691_4df0_bcf8_b03f712dbf61.slice/crio-834d147dea0cb529462077f0143339b32e7cb27402dcb77936009418eb28b991 WatchSource:0}: Error finding container 834d147dea0cb529462077f0143339b32e7cb27402dcb77936009418eb28b991: Status 404 returned error can't find the container with id 834d147dea0cb529462077f0143339b32e7cb27402dcb77936009418eb28b991 Apr 23 13:45:11.749931 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:11.749892 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" event={"ID":"ae2dabb8-a691-4df0-bcf8-b03f712dbf61","Type":"ContainerStarted","Data":"ee29710fdc063998aec3cdd8c41efc68f82d63038f0a48972fad606096b9da06"} Apr 23 13:45:11.749931 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:11.749933 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" event={"ID":"ae2dabb8-a691-4df0-bcf8-b03f712dbf61","Type":"ContainerStarted","Data":"834d147dea0cb529462077f0143339b32e7cb27402dcb77936009418eb28b991"} Apr 23 13:45:11.750160 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:11.750043 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:45:11.766212 ip-10-0-132-207 kubenswrapper[2577]: I0423 
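The same race repeats for each predictor: the serving-cert secret (here sequence-graph-f7bdd-serving-cert, earlier switch-graph-d2d70-serving-cert) is minted asynchronously after the pod is created, so the first proxy-tls mount attempt loses and the 500ms retry wins. On OpenShift such secrets are typically generated by the service-ca controller from an annotation on the matching Service; neither the Service nor the controller appears in this excerpt, so the reconstruction below is an assumption, not output from this cluster:

    package main

    import (
        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    // servingCertService is an assumed reconstruction of the Service that
    // would make service-ca create the TLS secret mounted as proxy-tls.
    // The secret is created and rotated asynchronously, which explains the
    // single failed mount attempt before it exists.
    func servingCertService() *corev1.Service {
        return &corev1.Service{
            ObjectMeta: metav1.ObjectMeta{
                Name:      "sequence-graph-f7bdd",
                Namespace: "kserve-ci-e2e-test",
                Annotations: map[string]string{
                    "service.beta.openshift.io/serving-cert-secret-name": "sequence-graph-f7bdd-serving-cert",
                },
            },
            Spec: corev1.ServiceSpec{
                Ports: []corev1.ServicePort{{Port: 443, TargetPort: intstr.FromInt(8443)}},
            },
        }
    }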
Apr 23 13:45:17.758873 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:45:17.758848 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk"
Apr 23 13:52:49.309227 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.309196 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b"]
Apr 23 13:52:49.311511 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.309427 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" containerName="switch-graph-d2d70" containerID="cri-o://a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da" gracePeriod=30
Apr 23 13:52:49.477155 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.477125 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"]
Apr 23 13:52:49.477464 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.477438 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kserve-container" containerID="cri-o://ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d" gracePeriod=30
Apr 23 13:52:49.477590 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.477501 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kube-rbac-proxy" containerID="cri-o://1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d" gracePeriod=30
Apr 23 13:52:49.590384 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.590300 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"]
Apr 23 13:52:49.593990 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.593974 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:49.597610 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.597590 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-2d7b2-predictor-serving-cert\""
Apr 23 13:52:49.597722 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.597595 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\""
Apr 23 13:52:49.619560 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.619538 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"]
Apr 23 13:52:49.745253 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.745218 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/31bac30d-bf67-403f-b7e9-a5577ec06b89-error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2d7b2-predictor-595799759b-8gwcl\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:49.745253 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.745254 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bhbw\" (UniqueName: \"kubernetes.io/projected/31bac30d-bf67-403f-b7e9-a5577ec06b89-kube-api-access-2bhbw\") pod \"error-404-isvc-2d7b2-predictor-595799759b-8gwcl\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:49.745447 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.745275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31bac30d-bf67-403f-b7e9-a5577ec06b89-proxy-tls\") pod \"error-404-isvc-2d7b2-predictor-595799759b-8gwcl\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:49.846604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.846516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/31bac30d-bf67-403f-b7e9-a5577ec06b89-error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2d7b2-predictor-595799759b-8gwcl\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:49.846604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.846555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bhbw\" (UniqueName: \"kubernetes.io/projected/31bac30d-bf67-403f-b7e9-a5577ec06b89-kube-api-access-2bhbw\") pod \"error-404-isvc-2d7b2-predictor-595799759b-8gwcl\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:49.846604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.846576 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31bac30d-bf67-403f-b7e9-a5577ec06b89-proxy-tls\") pod \"error-404-isvc-2d7b2-predictor-595799759b-8gwcl\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:49.847212 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.847183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/31bac30d-bf67-403f-b7e9-a5577ec06b89-error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2d7b2-predictor-595799759b-8gwcl\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:49.848942 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.848922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31bac30d-bf67-403f-b7e9-a5577ec06b89-proxy-tls\") pod \"error-404-isvc-2d7b2-predictor-595799759b-8gwcl\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:49.853757 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.853737 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bhbw\" (UniqueName: \"kubernetes.io/projected/31bac30d-bf67-403f-b7e9-a5577ec06b89-kube-api-access-2bhbw\") pod \"error-404-isvc-2d7b2-predictor-595799759b-8gwcl\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:49.903477 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:49.903450 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"
Apr 23 13:52:50.010285 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:50.010243 2577 generic.go:358] "Generic (PLEG): container finished" podID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerID="1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d" exitCode=2
Apr 23 13:52:50.010426 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:50.010301 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" event={"ID":"a96fbf6d-90ba-4223-868a-f95ba2c8d876","Type":"ContainerDied","Data":"1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d"}
Apr 23 13:52:50.025225 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:50.025196 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"]
Apr 23 13:52:50.026008 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:52:50.025986 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31bac30d_bf67_403f_b7e9_a5577ec06b89.slice/crio-fe7797b6d673400d2e0c20a89cf100f3199f6dca78ed96ad78e76eacbe1d1100 WatchSource:0}: Error finding container fe7797b6d673400d2e0c20a89cf100f3199f6dca78ed96ad78e76eacbe1d1100: Status 404 returned error can't find the container with id fe7797b6d673400d2e0c20a89cf100f3199f6dca78ed96ad78e76eacbe1d1100
Apr 23 13:52:50.028480 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:50.028459 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:52:51.014897 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:51.014858 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" event={"ID":"31bac30d-bf67-403f-b7e9-a5577ec06b89","Type":"ContainerStarted","Data":"e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c"}
pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" event={"ID":"31bac30d-bf67-403f-b7e9-a5577ec06b89","Type":"ContainerStarted","Data":"e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c"} Apr 23 13:52:51.014897 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:51.014897 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" event={"ID":"31bac30d-bf67-403f-b7e9-a5577ec06b89","Type":"ContainerStarted","Data":"b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5"} Apr 23 13:52:51.014897 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:51.014908 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" event={"ID":"31bac30d-bf67-403f-b7e9-a5577ec06b89","Type":"ContainerStarted","Data":"fe7797b6d673400d2e0c20a89cf100f3199f6dca78ed96ad78e76eacbe1d1100"} Apr 23 13:52:51.015403 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:51.015117 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" Apr 23 13:52:51.015403 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:51.015206 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" Apr 23 13:52:51.016481 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:51.016454 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 23 13:52:51.032943 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:51.032904 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" podStartSLOduration=2.032891187 podStartE2EDuration="2.032891187s" podCreationTimestamp="2026-04-23 13:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:52:51.031227922 +0000 UTC m=+1244.009136513" watchObservedRunningTime="2026-04-23 13:52:51.032891187 +0000 UTC m=+1244.010799848" Apr 23 13:52:52.018100 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.018063 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 23 13:52:52.414796 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.414774 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" Apr 23 13:52:52.565301 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.565213 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a96fbf6d-90ba-4223-868a-f95ba2c8d876-proxy-tls\") pod \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " Apr 23 13:52:52.565301 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.565265 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8khlw\" (UniqueName: \"kubernetes.io/projected/a96fbf6d-90ba-4223-868a-f95ba2c8d876-kube-api-access-8khlw\") pod \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " Apr 23 13:52:52.565529 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.565314 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-d2d70-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a96fbf6d-90ba-4223-868a-f95ba2c8d876-error-404-isvc-d2d70-kube-rbac-proxy-sar-config\") pod \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\" (UID: \"a96fbf6d-90ba-4223-868a-f95ba2c8d876\") " Apr 23 13:52:52.565776 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.565746 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a96fbf6d-90ba-4223-868a-f95ba2c8d876-error-404-isvc-d2d70-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-d2d70-kube-rbac-proxy-sar-config") pod "a96fbf6d-90ba-4223-868a-f95ba2c8d876" (UID: "a96fbf6d-90ba-4223-868a-f95ba2c8d876"). InnerVolumeSpecName "error-404-isvc-d2d70-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:52:52.567306 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.567272 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a96fbf6d-90ba-4223-868a-f95ba2c8d876-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a96fbf6d-90ba-4223-868a-f95ba2c8d876" (UID: "a96fbf6d-90ba-4223-868a-f95ba2c8d876"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:52:52.567431 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.567405 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a96fbf6d-90ba-4223-868a-f95ba2c8d876-kube-api-access-8khlw" (OuterVolumeSpecName: "kube-api-access-8khlw") pod "a96fbf6d-90ba-4223-868a-f95ba2c8d876" (UID: "a96fbf6d-90ba-4223-868a-f95ba2c8d876"). InnerVolumeSpecName "kube-api-access-8khlw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:52:52.653559 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.653528 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" containerName="switch-graph-d2d70" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:52:52.666684 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.666665 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a96fbf6d-90ba-4223-868a-f95ba2c8d876-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:52:52.666684 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.666684 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8khlw\" (UniqueName: \"kubernetes.io/projected/a96fbf6d-90ba-4223-868a-f95ba2c8d876-kube-api-access-8khlw\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:52:52.666795 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:52.666695 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-d2d70-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a96fbf6d-90ba-4223-868a-f95ba2c8d876-error-404-isvc-d2d70-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:52:53.022542 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.022484 2577 generic.go:358] "Generic (PLEG): container finished" podID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerID="ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d" exitCode=0 Apr 23 13:52:53.022916 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.022573 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" Apr 23 13:52:53.022916 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.022597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" event={"ID":"a96fbf6d-90ba-4223-868a-f95ba2c8d876","Type":"ContainerDied","Data":"ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d"} Apr 23 13:52:53.022916 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.022639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk" event={"ID":"a96fbf6d-90ba-4223-868a-f95ba2c8d876","Type":"ContainerDied","Data":"7862ada0c54f0f641322559d6692711b2effa7bac123ba19005b73064d823020"} Apr 23 13:52:53.022916 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.022657 2577 scope.go:117] "RemoveContainer" containerID="1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d" Apr 23 13:52:53.032074 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.031945 2577 scope.go:117] "RemoveContainer" containerID="ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d" Apr 23 13:52:53.038941 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.038925 2577 scope.go:117] "RemoveContainer" containerID="1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d" Apr 23 13:52:53.039203 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:52:53.039185 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d\": container with ID starting with 1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d not found: ID does not exist" containerID="1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d" Apr 23 13:52:53.039244 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.039212 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d"} err="failed to get container status \"1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d\": rpc error: code = NotFound desc = could not find container \"1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d\": container with ID starting with 1ccd34452f113bed06d0b36fd7eb83387941b21f35674a56b0911a2ba7390e0d not found: ID does not exist" Apr 23 13:52:53.039244 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.039230 2577 scope.go:117] "RemoveContainer" containerID="ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d" Apr 23 13:52:53.039460 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:52:53.039442 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d\": container with ID starting with ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d not found: ID does not exist" containerID="ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d" Apr 23 13:52:53.039537 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.039467 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d"} err="failed to get container status \"ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d\": rpc error: 
code = NotFound desc = could not find container \"ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d\": container with ID starting with ac3c7a53a4cd908b585f30e111c0cfb61d2e44dff91a8413119ed501cf9a771d not found: ID does not exist" Apr 23 13:52:53.044588 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.044569 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"] Apr 23 13:52:53.049756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.049736 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d2d70-predictor-557fcdd5cb-lmqhk"] Apr 23 13:52:53.584483 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:53.584446 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" path="/var/lib/kubelet/pods/a96fbf6d-90ba-4223-868a-f95ba2c8d876/volumes" Apr 23 13:52:57.022777 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:57.022705 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" Apr 23 13:52:57.023212 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:57.023182 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 23 13:52:57.654086 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:52:57.654047 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" containerName="switch-graph-d2d70" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:53:02.653521 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:02.653463 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" containerName="switch-graph-d2d70" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:53:02.653971 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:02.653606 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" Apr 23 13:53:07.024139 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:07.024097 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 23 13:53:07.658246 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:07.658205 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" containerName="switch-graph-d2d70" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:53:12.654151 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:12.654102 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" containerName="switch-graph-d2d70" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 
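The ContainerStatus/DeleteContainer NotFound errors above are a benign race: the container had already been removed, so the follow-up status lookup fails and the kubelet simply moves on, as the subsequent SyncLoop DELETE/REMOVE shows. The usual pattern for such cleanup paths is to treat gRPC NotFound as success; a minimal sketch of that pattern follows (hypothetical helper, not kubelet source):

    package main

    import (
        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // ignoreNotFound maps "already gone" onto success, making cleanup of a
    // container (or any gRPC-backed resource) idempotent. status.Code returns
    // codes.OK for a nil error, so nil passes through unchanged.
    func ignoreNotFound(err error) error {
        if status.Code(err) == codes.NotFound {
            return nil
        }
        return err
    }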
Apr 23 13:53:17.654192 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:17.654151 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" containerName="switch-graph-d2d70" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:53:19.448507 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:19.448471 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b"
Apr 23 13:53:19.475972 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:19.475942 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43db906c-91ff-4eac-84ed-6ee746033d17-proxy-tls\") pod \"43db906c-91ff-4eac-84ed-6ee746033d17\" (UID: \"43db906c-91ff-4eac-84ed-6ee746033d17\") "
Apr 23 13:53:19.476096 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:19.476053 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43db906c-91ff-4eac-84ed-6ee746033d17-openshift-service-ca-bundle\") pod \"43db906c-91ff-4eac-84ed-6ee746033d17\" (UID: \"43db906c-91ff-4eac-84ed-6ee746033d17\") "
Apr 23 13:53:19.476400 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:19.476376 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43db906c-91ff-4eac-84ed-6ee746033d17-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "43db906c-91ff-4eac-84ed-6ee746033d17" (UID: "43db906c-91ff-4eac-84ed-6ee746033d17"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:53:19.477922 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:19.477901 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43db906c-91ff-4eac-84ed-6ee746033d17-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "43db906c-91ff-4eac-84ed-6ee746033d17" (UID: "43db906c-91ff-4eac-84ed-6ee746033d17"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:53:19.577082 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:19.577006 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43db906c-91ff-4eac-84ed-6ee746033d17-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\""
Apr 23 13:53:19.577082 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:19.577034 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43db906c-91ff-4eac-84ed-6ee746033d17-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\""
Apr 23 13:53:20.102412 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:20.102373 2577 generic.go:358] "Generic (PLEG): container finished" podID="43db906c-91ff-4eac-84ed-6ee746033d17" containerID="a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da" exitCode=0
Apr 23 13:53:20.102685 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:20.102451 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b"
Apr 23 13:53:20.102685 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:20.102453 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" event={"ID":"43db906c-91ff-4eac-84ed-6ee746033d17","Type":"ContainerDied","Data":"a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da"}
Apr 23 13:53:20.102685 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:20.102526 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b" event={"ID":"43db906c-91ff-4eac-84ed-6ee746033d17","Type":"ContainerDied","Data":"b53f90ff87ead83b867543694f79b6f7140c0e1b433b9f42d4acee0342592934"}
Apr 23 13:53:20.102685 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:20.102541 2577 scope.go:117] "RemoveContainer" containerID="a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da"
Apr 23 13:53:20.109799 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:20.109780 2577 scope.go:117] "RemoveContainer" containerID="a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da"
Apr 23 13:53:20.110019 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:53:20.109997 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da\": container with ID starting with a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da not found: ID does not exist" containerID="a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da"
Apr 23 13:53:20.110079 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:20.110030 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da"} err="failed to get container status \"a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da\": rpc error: code = NotFound desc = could not find container \"a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da\": container with ID starting with a46b7dcebb728b71027a659ce76b83cce91b4fbadf33322da87ef3c781fc56da not found: ID does not exist"
Apr 23 13:53:20.118047 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:20.118023 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b"]
Apr 23 13:53:20.121460 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:20.121439 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d2d70-6d98679489-grd7b"]
Apr 23 13:53:21.583169 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:21.583136 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" path="/var/lib/kubelet/pods/43db906c-91ff-4eac-84ed-6ee746033d17/volumes"
Apr 23 13:53:25.103937 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.103908 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk"]
Apr 23 13:53:25.104314 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.104121 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" containerName="sequence-graph-f7bdd" containerID="cri-o://ee29710fdc063998aec3cdd8c41efc68f82d63038f0a48972fad606096b9da06" gracePeriod=30
Apr 23 13:53:25.306639 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.306609 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj"]
Apr 23 13:53:25.306929 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.306879 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kserve-container" containerID="cri-o://91a6b0acb99f53048054186a255b01fbbf4eca671b320a4030f6c7b71a463aa5" gracePeriod=30
Apr 23 13:53:25.306929 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.306921 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kube-rbac-proxy" containerID="cri-o://5dbf385f79331e5ffc2ba2d91a98abbbb745924748705618b15fbe3e319c1d24" gracePeriod=30
Apr 23 13:53:25.370459 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.370402 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"]
Apr 23 13:53:25.370716 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.370704 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" containerName="switch-graph-d2d70"
Apr 23 13:53:25.370763 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.370718 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" containerName="switch-graph-d2d70"
Apr 23 13:53:25.370763 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.370735 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kserve-container"
Apr 23 13:53:25.370763 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.370741 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kserve-container"
Apr 23 13:53:25.370763 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.370754 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kube-rbac-proxy"
Apr 23 13:53:25.370763 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.370759 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kube-rbac-proxy"
Apr 23 13:53:25.370907 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.370802 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kube-rbac-proxy"
Apr 23 13:53:25.370907 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.370809 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="43db906c-91ff-4eac-84ed-6ee746033d17" containerName="switch-graph-d2d70"
Apr 23 13:53:25.370907 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.370817 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a96fbf6d-90ba-4223-868a-f95ba2c8d876" containerName="kserve-container"
Apr 23 13:53:25.375177 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.375162 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.377624 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.377607 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-1cfcc-predictor-serving-cert\""
Apr 23 13:53:25.377900 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.377879 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\""
Apr 23 13:53:25.381986 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.381809 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"]
Apr 23 13:53:25.421642 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.421616 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmpdl\" (UniqueName: \"kubernetes.io/projected/38b4a054-ea79-474e-ab9a-7cdac8139d19-kube-api-access-nmpdl\") pod \"error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.421754 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.421725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38b4a054-ea79-474e-ab9a-7cdac8139d19-error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.421816 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.421798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38b4a054-ea79-474e-ab9a-7cdac8139d19-proxy-tls\") pod \"error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.522350 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.522325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38b4a054-ea79-474e-ab9a-7cdac8139d19-proxy-tls\") pod \"error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.522480 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.522368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmpdl\" (UniqueName: \"kubernetes.io/projected/38b4a054-ea79-474e-ab9a-7cdac8139d19-kube-api-access-nmpdl\") pod \"error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.522480 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.522410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38b4a054-ea79-474e-ab9a-7cdac8139d19-error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.522956 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.522937 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38b4a054-ea79-474e-ab9a-7cdac8139d19-error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.524590 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.524567 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38b4a054-ea79-474e-ab9a-7cdac8139d19-proxy-tls\") pod \"error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.530949 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.530926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmpdl\" (UniqueName: \"kubernetes.io/projected/38b4a054-ea79-474e-ab9a-7cdac8139d19-kube-api-access-nmpdl\") pod \"error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.685879 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.685853 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:25.804732 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:25.804698 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"]
Apr 23 13:53:25.806993 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:53:25.806967 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b4a054_ea79_474e_ab9a_7cdac8139d19.slice/crio-dec8801bd1f2e35cc6acc3f025974f94570a3bd2e2a046f41812a41b4f714796 WatchSource:0}: Error finding container dec8801bd1f2e35cc6acc3f025974f94570a3bd2e2a046f41812a41b4f714796: Status 404 returned error can't find the container with id dec8801bd1f2e35cc6acc3f025974f94570a3bd2e2a046f41812a41b4f714796
Apr 23 13:53:26.120233 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:26.120155 2577 generic.go:358] "Generic (PLEG): container finished" podID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerID="5dbf385f79331e5ffc2ba2d91a98abbbb745924748705618b15fbe3e319c1d24" exitCode=2
Apr 23 13:53:26.120233 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:26.120224 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" event={"ID":"9586a1e0-fab0-42ee-8108-aca7842dcef2","Type":"ContainerDied","Data":"5dbf385f79331e5ffc2ba2d91a98abbbb745924748705618b15fbe3e319c1d24"}
Apr 23 13:53:26.121692 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:26.121668 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" event={"ID":"38b4a054-ea79-474e-ab9a-7cdac8139d19","Type":"ContainerStarted","Data":"cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6"}
Apr 23 13:53:26.121798 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:26.121698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" event={"ID":"38b4a054-ea79-474e-ab9a-7cdac8139d19","Type":"ContainerStarted","Data":"709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8"}
Apr 23 13:53:26.121798 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:26.121709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" event={"ID":"38b4a054-ea79-474e-ab9a-7cdac8139d19","Type":"ContainerStarted","Data":"dec8801bd1f2e35cc6acc3f025974f94570a3bd2e2a046f41812a41b4f714796"}
Apr 23 13:53:26.121872 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:26.121798 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"
Apr 23 13:53:26.139114 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:26.139074 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" podStartSLOduration=1.13906299 podStartE2EDuration="1.13906299s" podCreationTimestamp="2026-04-23 13:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:53:26.137268882 +0000 UTC m=+1279.115177476" watchObservedRunningTime="2026-04-23 13:53:26.13906299 +0000 UTC m=+1279.116971583"
Apr 23 13:53:27.023806 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:27.023763 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 23 13:53:27.124550 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:27.124517 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" Apr 23 13:53:27.125782 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:27.125757 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 13:53:27.757401 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:27.757357 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" containerName="sequence-graph-f7bdd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:53:28.129208 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.129180 2577 generic.go:358] "Generic (PLEG): container finished" podID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerID="91a6b0acb99f53048054186a255b01fbbf4eca671b320a4030f6c7b71a463aa5" exitCode=0 Apr 23 13:53:28.129553 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.129256 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" event={"ID":"9586a1e0-fab0-42ee-8108-aca7842dcef2","Type":"ContainerDied","Data":"91a6b0acb99f53048054186a255b01fbbf4eca671b320a4030f6c7b71a463aa5"} Apr 23 13:53:28.129604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.129546 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 13:53:28.250646 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.250625 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:53:28.341136 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.341061 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9586a1e0-fab0-42ee-8108-aca7842dcef2-proxy-tls\") pod \"9586a1e0-fab0-42ee-8108-aca7842dcef2\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " Apr 23 13:53:28.341136 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.341100 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9586a1e0-fab0-42ee-8108-aca7842dcef2-error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\") pod \"9586a1e0-fab0-42ee-8108-aca7842dcef2\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " Apr 23 13:53:28.341136 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.341123 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlqnk\" (UniqueName: \"kubernetes.io/projected/9586a1e0-fab0-42ee-8108-aca7842dcef2-kube-api-access-wlqnk\") pod \"9586a1e0-fab0-42ee-8108-aca7842dcef2\" (UID: \"9586a1e0-fab0-42ee-8108-aca7842dcef2\") " Apr 23 13:53:28.341442 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.341420 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9586a1e0-fab0-42ee-8108-aca7842dcef2-error-404-isvc-f7bdd-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-f7bdd-kube-rbac-proxy-sar-config") pod "9586a1e0-fab0-42ee-8108-aca7842dcef2" (UID: "9586a1e0-fab0-42ee-8108-aca7842dcef2"). InnerVolumeSpecName "error-404-isvc-f7bdd-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:53:28.343125 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.343100 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9586a1e0-fab0-42ee-8108-aca7842dcef2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9586a1e0-fab0-42ee-8108-aca7842dcef2" (UID: "9586a1e0-fab0-42ee-8108-aca7842dcef2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:53:28.343228 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.343202 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9586a1e0-fab0-42ee-8108-aca7842dcef2-kube-api-access-wlqnk" (OuterVolumeSpecName: "kube-api-access-wlqnk") pod "9586a1e0-fab0-42ee-8108-aca7842dcef2" (UID: "9586a1e0-fab0-42ee-8108-aca7842dcef2"). InnerVolumeSpecName "kube-api-access-wlqnk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:53:28.442194 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.442164 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9586a1e0-fab0-42ee-8108-aca7842dcef2-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:53:28.442194 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.442187 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9586a1e0-fab0-42ee-8108-aca7842dcef2-error-404-isvc-f7bdd-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:53:28.442194 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:28.442197 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wlqnk\" (UniqueName: \"kubernetes.io/projected/9586a1e0-fab0-42ee-8108-aca7842dcef2-kube-api-access-wlqnk\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:53:29.133418 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:29.133384 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" event={"ID":"9586a1e0-fab0-42ee-8108-aca7842dcef2","Type":"ContainerDied","Data":"72b7ada95bfdb582f16abcea9459bb721fc7cf901ae6a0cbb8415069ae312514"} Apr 23 13:53:29.133418 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:29.133426 2577 scope.go:117] "RemoveContainer" containerID="5dbf385f79331e5ffc2ba2d91a98abbbb745924748705618b15fbe3e319c1d24" Apr 23 13:53:29.133902 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:29.133427 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj" Apr 23 13:53:29.141459 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:29.141368 2577 scope.go:117] "RemoveContainer" containerID="91a6b0acb99f53048054186a255b01fbbf4eca671b320a4030f6c7b71a463aa5" Apr 23 13:53:29.155126 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:29.155105 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj"] Apr 23 13:53:29.158660 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:29.158640 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f7bdd-predictor-84546c4756-jjbxj"] Apr 23 13:53:29.583449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:29.583414 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" path="/var/lib/kubelet/pods/9586a1e0-fab0-42ee-8108-aca7842dcef2/volumes" Apr 23 13:53:32.756813 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:32.756767 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" containerName="sequence-graph-f7bdd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:53:33.133529 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:33.133442 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" Apr 23 13:53:33.133952 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:33.133924 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" 
podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 13:53:37.023633 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:37.023605 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" Apr 23 13:53:37.757446 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:37.757402 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" containerName="sequence-graph-f7bdd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:53:37.757631 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:37.757533 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:53:42.757536 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:42.757480 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" containerName="sequence-graph-f7bdd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:53:43.134050 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:43.133961 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 13:53:47.756855 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:47.756817 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" containerName="sequence-graph-f7bdd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:53:49.578429 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.578393 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr"] Apr 23 13:53:49.578851 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.578698 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kube-rbac-proxy" Apr 23 13:53:49.578851 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.578709 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kube-rbac-proxy" Apr 23 13:53:49.578851 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.578726 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kserve-container" Apr 23 13:53:49.578851 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.578734 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kserve-container" Apr 23 13:53:49.578851 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.578789 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kube-rbac-proxy" Apr 23 13:53:49.578851 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.578799 2577 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="9586a1e0-fab0-42ee-8108-aca7842dcef2" containerName="kserve-container" Apr 23 13:53:49.581427 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.581406 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:53:49.584347 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.584328 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-2d7b2-serving-cert\"" Apr 23 13:53:49.584579 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.584559 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-2d7b2-kube-rbac-proxy-sar-config\"" Apr 23 13:53:49.590177 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.590154 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr"] Apr 23 13:53:49.700851 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.700828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a400003-12aa-4880-a99e-89a278351fee-openshift-service-ca-bundle\") pod \"ensemble-graph-2d7b2-5475c4b8c9-gxkpr\" (UID: \"9a400003-12aa-4880-a99e-89a278351fee\") " pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:53:49.700969 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.700860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a400003-12aa-4880-a99e-89a278351fee-proxy-tls\") pod \"ensemble-graph-2d7b2-5475c4b8c9-gxkpr\" (UID: \"9a400003-12aa-4880-a99e-89a278351fee\") " pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:53:49.801607 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.801569 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a400003-12aa-4880-a99e-89a278351fee-openshift-service-ca-bundle\") pod \"ensemble-graph-2d7b2-5475c4b8c9-gxkpr\" (UID: \"9a400003-12aa-4880-a99e-89a278351fee\") " pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:53:49.801607 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.801607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a400003-12aa-4880-a99e-89a278351fee-proxy-tls\") pod \"ensemble-graph-2d7b2-5475c4b8c9-gxkpr\" (UID: \"9a400003-12aa-4880-a99e-89a278351fee\") " pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:53:49.802198 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.802176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a400003-12aa-4880-a99e-89a278351fee-openshift-service-ca-bundle\") pod \"ensemble-graph-2d7b2-5475c4b8c9-gxkpr\" (UID: \"9a400003-12aa-4880-a99e-89a278351fee\") " pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:53:49.803859 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.803838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a400003-12aa-4880-a99e-89a278351fee-proxy-tls\") pod \"ensemble-graph-2d7b2-5475c4b8c9-gxkpr\" (UID: 
\"9a400003-12aa-4880-a99e-89a278351fee\") " pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:53:49.892154 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:49.892084 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:53:50.015247 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:50.015219 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr"] Apr 23 13:53:50.017695 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:53:50.017669 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a400003_12aa_4880_a99e_89a278351fee.slice/crio-5ea8e9f75f981d7575dda2d7c2f8967ff17832a22fc20ccf54dec74b97783de4 WatchSource:0}: Error finding container 5ea8e9f75f981d7575dda2d7c2f8967ff17832a22fc20ccf54dec74b97783de4: Status 404 returned error can't find the container with id 5ea8e9f75f981d7575dda2d7c2f8967ff17832a22fc20ccf54dec74b97783de4 Apr 23 13:53:50.199756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:50.199717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" event={"ID":"9a400003-12aa-4880-a99e-89a278351fee","Type":"ContainerStarted","Data":"d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b"} Apr 23 13:53:50.199756 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:50.199757 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" event={"ID":"9a400003-12aa-4880-a99e-89a278351fee","Type":"ContainerStarted","Data":"5ea8e9f75f981d7575dda2d7c2f8967ff17832a22fc20ccf54dec74b97783de4"} Apr 23 13:53:50.199998 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:50.199799 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:53:50.222821 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:50.222779 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" podStartSLOduration=1.222765734 podStartE2EDuration="1.222765734s" podCreationTimestamp="2026-04-23 13:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:53:50.221123658 +0000 UTC m=+1303.199032251" watchObservedRunningTime="2026-04-23 13:53:50.222765734 +0000 UTC m=+1303.200674354" Apr 23 13:53:52.756881 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:52.756845 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" containerName="sequence-graph-f7bdd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:53:53.134255 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:53.134172 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 13:53:55.222017 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:55.221916 2577 generic.go:358] "Generic (PLEG): container finished" podID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" 
containerID="ee29710fdc063998aec3cdd8c41efc68f82d63038f0a48972fad606096b9da06" exitCode=0 Apr 23 13:53:55.222017 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:55.221995 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" event={"ID":"ae2dabb8-a691-4df0-bcf8-b03f712dbf61","Type":"ContainerDied","Data":"ee29710fdc063998aec3cdd8c41efc68f82d63038f0a48972fad606096b9da06"} Apr 23 13:53:55.235483 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:55.235464 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:53:55.338524 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:55.338482 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-proxy-tls\") pod \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\" (UID: \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\") " Apr 23 13:53:55.338633 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:55.338573 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-openshift-service-ca-bundle\") pod \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\" (UID: \"ae2dabb8-a691-4df0-bcf8-b03f712dbf61\") " Apr 23 13:53:55.338908 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:55.338884 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "ae2dabb8-a691-4df0-bcf8-b03f712dbf61" (UID: "ae2dabb8-a691-4df0-bcf8-b03f712dbf61"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:53:55.340382 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:55.340362 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ae2dabb8-a691-4df0-bcf8-b03f712dbf61" (UID: "ae2dabb8-a691-4df0-bcf8-b03f712dbf61"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:53:55.439810 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:55.439787 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:53:55.439907 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:55.439811 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae2dabb8-a691-4df0-bcf8-b03f712dbf61-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:53:56.208806 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:56.208776 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:53:56.226113 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:56.226073 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" event={"ID":"ae2dabb8-a691-4df0-bcf8-b03f712dbf61","Type":"ContainerDied","Data":"834d147dea0cb529462077f0143339b32e7cb27402dcb77936009418eb28b991"} Apr 23 13:53:56.226113 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:56.226088 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk" Apr 23 13:53:56.226113 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:56.226116 2577 scope.go:117] "RemoveContainer" containerID="ee29710fdc063998aec3cdd8c41efc68f82d63038f0a48972fad606096b9da06" Apr 23 13:53:56.246159 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:56.246122 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk"] Apr 23 13:53:56.258462 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:56.258432 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-f7bdd-7ccdd96579-trdwk"] Apr 23 13:53:57.583374 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:57.583337 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" path="/var/lib/kubelet/pods/ae2dabb8-a691-4df0-bcf8-b03f712dbf61/volumes" Apr 23 13:53:59.617792 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.617717 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr"] Apr 23 13:53:59.618138 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.617956 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" podUID="9a400003-12aa-4880-a99e-89a278351fee" containerName="ensemble-graph-2d7b2" containerID="cri-o://d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b" gracePeriod=30 Apr 23 13:53:59.914293 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.914212 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"] Apr 23 13:53:59.914588 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.914562 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kserve-container" 
containerID="cri-o://b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5" gracePeriod=30 Apr 23 13:53:59.914691 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.914584 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kube-rbac-proxy" containerID="cri-o://e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c" gracePeriod=30 Apr 23 13:53:59.959059 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.959034 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q"] Apr 23 13:53:59.959338 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.959326 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" containerName="sequence-graph-f7bdd" Apr 23 13:53:59.959387 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.959340 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" containerName="sequence-graph-f7bdd" Apr 23 13:53:59.959421 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.959396 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae2dabb8-a691-4df0-bcf8-b03f712dbf61" containerName="sequence-graph-f7bdd" Apr 23 13:53:59.963592 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.963574 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:53:59.966457 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.966441 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-37fbd-predictor-serving-cert\"" Apr 23 13:53:59.966457 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.966446 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-37fbd-kube-rbac-proxy-sar-config\"" Apr 23 13:53:59.975364 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:53:59.975344 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q"] Apr 23 13:54:00.074308 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.074267 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-37fbd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/014980aa-a818-4b54-8ae1-7d91c7a3e843-error-404-isvc-37fbd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-37fbd-predictor-7544db6574-c755q\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:00.074446 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.074321 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qpln\" (UniqueName: \"kubernetes.io/projected/014980aa-a818-4b54-8ae1-7d91c7a3e843-kube-api-access-8qpln\") pod \"error-404-isvc-37fbd-predictor-7544db6574-c755q\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:00.074446 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.074393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/014980aa-a818-4b54-8ae1-7d91c7a3e843-proxy-tls\") pod \"error-404-isvc-37fbd-predictor-7544db6574-c755q\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:00.175586 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.175506 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-37fbd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/014980aa-a818-4b54-8ae1-7d91c7a3e843-error-404-isvc-37fbd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-37fbd-predictor-7544db6574-c755q\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:00.175586 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.175546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qpln\" (UniqueName: \"kubernetes.io/projected/014980aa-a818-4b54-8ae1-7d91c7a3e843-kube-api-access-8qpln\") pod \"error-404-isvc-37fbd-predictor-7544db6574-c755q\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:00.175797 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.175605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/014980aa-a818-4b54-8ae1-7d91c7a3e843-proxy-tls\") pod \"error-404-isvc-37fbd-predictor-7544db6574-c755q\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:00.176171 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.176145 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-37fbd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/014980aa-a818-4b54-8ae1-7d91c7a3e843-error-404-isvc-37fbd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-37fbd-predictor-7544db6574-c755q\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:00.177883 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.177862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/014980aa-a818-4b54-8ae1-7d91c7a3e843-proxy-tls\") pod \"error-404-isvc-37fbd-predictor-7544db6574-c755q\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:00.184701 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.184677 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qpln\" (UniqueName: \"kubernetes.io/projected/014980aa-a818-4b54-8ae1-7d91c7a3e843-kube-api-access-8qpln\") pod \"error-404-isvc-37fbd-predictor-7544db6574-c755q\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:00.239972 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.239945 2577 generic.go:358] "Generic (PLEG): container finished" podID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerID="e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c" exitCode=2 Apr 23 13:54:00.240069 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.239978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" event={"ID":"31bac30d-bf67-403f-b7e9-a5577ec06b89","Type":"ContainerDied","Data":"e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c"} Apr 23 13:54:00.273304 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.273283 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:00.405699 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:00.405675 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q"] Apr 23 13:54:00.407633 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:54:00.407595 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod014980aa_a818_4b54_8ae1_7d91c7a3e843.slice/crio-f18216fe5348e59a889557c0ed5396406eff9db9e62d80f82b12dc9ec3e4216a WatchSource:0}: Error finding container f18216fe5348e59a889557c0ed5396406eff9db9e62d80f82b12dc9ec3e4216a: Status 404 returned error can't find the container with id f18216fe5348e59a889557c0ed5396406eff9db9e62d80f82b12dc9ec3e4216a Apr 23 13:54:01.207477 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:01.207429 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" podUID="9a400003-12aa-4880-a99e-89a278351fee" containerName="ensemble-graph-2d7b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:01.244775 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:01.244737 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" event={"ID":"014980aa-a818-4b54-8ae1-7d91c7a3e843","Type":"ContainerStarted","Data":"8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2"} Apr 23 13:54:01.244775 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:01.244777 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" event={"ID":"014980aa-a818-4b54-8ae1-7d91c7a3e843","Type":"ContainerStarted","Data":"f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053"} Apr 23 13:54:01.244935 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:01.244791 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" event={"ID":"014980aa-a818-4b54-8ae1-7d91c7a3e843","Type":"ContainerStarted","Data":"f18216fe5348e59a889557c0ed5396406eff9db9e62d80f82b12dc9ec3e4216a"} Apr 23 13:54:01.244935 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:01.244888 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:01.266429 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:01.266266 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" podStartSLOduration=2.266249397 podStartE2EDuration="2.266249397s" podCreationTimestamp="2026-04-23 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:54:01.264422148 +0000 UTC m=+1314.242330734" watchObservedRunningTime="2026-04-23 13:54:01.266249397 +0000 UTC m=+1314.244157990" Apr 23 13:54:02.019204 ip-10-0-132-207 
kubenswrapper[2577]: I0423 13:54:02.019164 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.25:8643/healthz\": dial tcp 10.132.0.25:8643: connect: connection refused" Apr 23 13:54:02.248459 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:02.248429 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:02.249796 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:02.249763 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 23 13:54:02.870844 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:02.870821 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" Apr 23 13:54:02.999597 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:02.999568 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/31bac30d-bf67-403f-b7e9-a5577ec06b89-error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\") pod \"31bac30d-bf67-403f-b7e9-a5577ec06b89\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " Apr 23 13:54:02.999803 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:02.999609 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31bac30d-bf67-403f-b7e9-a5577ec06b89-proxy-tls\") pod \"31bac30d-bf67-403f-b7e9-a5577ec06b89\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " Apr 23 13:54:02.999803 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:02.999659 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bhbw\" (UniqueName: \"kubernetes.io/projected/31bac30d-bf67-403f-b7e9-a5577ec06b89-kube-api-access-2bhbw\") pod \"31bac30d-bf67-403f-b7e9-a5577ec06b89\" (UID: \"31bac30d-bf67-403f-b7e9-a5577ec06b89\") " Apr 23 13:54:02.999987 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:02.999964 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31bac30d-bf67-403f-b7e9-a5577ec06b89-error-404-isvc-2d7b2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-2d7b2-kube-rbac-proxy-sar-config") pod "31bac30d-bf67-403f-b7e9-a5577ec06b89" (UID: "31bac30d-bf67-403f-b7e9-a5577ec06b89"). InnerVolumeSpecName "error-404-isvc-2d7b2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:54:03.001699 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.001669 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31bac30d-bf67-403f-b7e9-a5577ec06b89-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31bac30d-bf67-403f-b7e9-a5577ec06b89" (UID: "31bac30d-bf67-403f-b7e9-a5577ec06b89"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:54:03.001795 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.001716 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bac30d-bf67-403f-b7e9-a5577ec06b89-kube-api-access-2bhbw" (OuterVolumeSpecName: "kube-api-access-2bhbw") pod "31bac30d-bf67-403f-b7e9-a5577ec06b89" (UID: "31bac30d-bf67-403f-b7e9-a5577ec06b89"). InnerVolumeSpecName "kube-api-access-2bhbw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:54:03.100198 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.100170 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2bhbw\" (UniqueName: \"kubernetes.io/projected/31bac30d-bf67-403f-b7e9-a5577ec06b89-kube-api-access-2bhbw\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:54:03.100198 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.100193 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/31bac30d-bf67-403f-b7e9-a5577ec06b89-error-404-isvc-2d7b2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:54:03.100337 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.100205 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31bac30d-bf67-403f-b7e9-a5577ec06b89-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:54:03.134406 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.134379 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 23 13:54:03.252193 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.252105 2577 generic.go:358] "Generic (PLEG): container finished" podID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerID="b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5" exitCode=0 Apr 23 13:54:03.252193 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.252177 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" Apr 23 13:54:03.252690 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.252182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" event={"ID":"31bac30d-bf67-403f-b7e9-a5577ec06b89","Type":"ContainerDied","Data":"b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5"} Apr 23 13:54:03.252690 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.252227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl" event={"ID":"31bac30d-bf67-403f-b7e9-a5577ec06b89","Type":"ContainerDied","Data":"fe7797b6d673400d2e0c20a89cf100f3199f6dca78ed96ad78e76eacbe1d1100"} Apr 23 13:54:03.252690 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.252248 2577 scope.go:117] "RemoveContainer" containerID="e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c" Apr 23 13:54:03.252690 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.252678 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 23 13:54:03.260998 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.260980 2577 scope.go:117] "RemoveContainer" containerID="b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5" Apr 23 13:54:03.269094 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.269075 2577 scope.go:117] "RemoveContainer" containerID="e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c" Apr 23 13:54:03.269364 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:54:03.269344 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c\": container with ID starting with e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c not found: ID does not exist" containerID="e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c" Apr 23 13:54:03.269439 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.269372 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c"} err="failed to get container status \"e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c\": rpc error: code = NotFound desc = could not find container \"e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c\": container with ID starting with e0bd6056fb5aab8347f7e527bfc93811fb9647ba966ac417c62576b9a71c935c not found: ID does not exist" Apr 23 13:54:03.269439 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.269388 2577 scope.go:117] "RemoveContainer" containerID="b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5" Apr 23 13:54:03.269631 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:54:03.269615 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5\": container with ID starting with b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5 not found: ID does not exist" 
containerID="b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5" Apr 23 13:54:03.269695 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.269634 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5"} err="failed to get container status \"b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5\": rpc error: code = NotFound desc = could not find container \"b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5\": container with ID starting with b7143d1c785e1ad6ed656093d0a6e90cf6b15d4a86c14c040e1d20be4a25f3d5 not found: ID does not exist" Apr 23 13:54:03.274086 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.273928 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"] Apr 23 13:54:03.275530 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.275510 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2d7b2-predictor-595799759b-8gwcl"] Apr 23 13:54:03.583602 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:03.583527 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" path="/var/lib/kubelet/pods/31bac30d-bf67-403f-b7e9-a5577ec06b89/volumes" Apr 23 13:54:06.207953 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:06.207917 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" podUID="9a400003-12aa-4880-a99e-89a278351fee" containerName="ensemble-graph-2d7b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:08.258105 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:08.258072 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:08.258632 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:08.258604 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 23 13:54:11.207532 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:11.207473 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" podUID="9a400003-12aa-4880-a99e-89a278351fee" containerName="ensemble-graph-2d7b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:11.207973 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:11.207596 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:54:13.134550 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:13.134520 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" Apr 23 13:54:16.206919 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:16.206879 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" podUID="9a400003-12aa-4880-a99e-89a278351fee" containerName="ensemble-graph-2d7b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:18.259054 ip-10-0-132-207 
kubenswrapper[2577]: I0423 13:54:18.259014 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 23 13:54:21.207245 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:21.207208 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" podUID="9a400003-12aa-4880-a99e-89a278351fee" containerName="ensemble-graph-2d7b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:25.323703 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.323670 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7"] Apr 23 13:54:25.324145 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.323975 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kserve-container" Apr 23 13:54:25.324145 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.323986 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kserve-container" Apr 23 13:54:25.324145 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.323995 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kube-rbac-proxy" Apr 23 13:54:25.324145 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.324001 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kube-rbac-proxy" Apr 23 13:54:25.324145 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.324048 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kube-rbac-proxy" Apr 23 13:54:25.324145 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.324055 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="31bac30d-bf67-403f-b7e9-a5577ec06b89" containerName="kserve-container" Apr 23 13:54:25.328191 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.328175 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:25.330641 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.330622 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-1cfcc-serving-cert\"" Apr 23 13:54:25.330937 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.330917 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-1cfcc-kube-rbac-proxy-sar-config\"" Apr 23 13:54:25.341595 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.341577 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7"] Apr 23 13:54:25.455709 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.455684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10263db1-913b-4de1-b858-7eccae24824d-openshift-service-ca-bundle\") pod \"sequence-graph-1cfcc-57cb468459-llds7\" (UID: \"10263db1-913b-4de1-b858-7eccae24824d\") " pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:25.455825 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.455713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10263db1-913b-4de1-b858-7eccae24824d-proxy-tls\") pod \"sequence-graph-1cfcc-57cb468459-llds7\" (UID: \"10263db1-913b-4de1-b858-7eccae24824d\") " pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:25.557100 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.557076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10263db1-913b-4de1-b858-7eccae24824d-openshift-service-ca-bundle\") pod \"sequence-graph-1cfcc-57cb468459-llds7\" (UID: \"10263db1-913b-4de1-b858-7eccae24824d\") " pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:25.557219 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.557109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10263db1-913b-4de1-b858-7eccae24824d-proxy-tls\") pod \"sequence-graph-1cfcc-57cb468459-llds7\" (UID: \"10263db1-913b-4de1-b858-7eccae24824d\") " pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:25.557723 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.557701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10263db1-913b-4de1-b858-7eccae24824d-openshift-service-ca-bundle\") pod \"sequence-graph-1cfcc-57cb468459-llds7\" (UID: \"10263db1-913b-4de1-b858-7eccae24824d\") " pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:25.559324 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.559308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10263db1-913b-4de1-b858-7eccae24824d-proxy-tls\") pod \"sequence-graph-1cfcc-57cb468459-llds7\" (UID: \"10263db1-913b-4de1-b858-7eccae24824d\") " pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:25.638276 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.638225 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:25.765229 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:25.765211 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7"] Apr 23 13:54:26.207502 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:26.207403 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" podUID="9a400003-12aa-4880-a99e-89a278351fee" containerName="ensemble-graph-2d7b2" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:26.318789 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:26.318752 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" event={"ID":"10263db1-913b-4de1-b858-7eccae24824d","Type":"ContainerStarted","Data":"12755dd30a8e239a13556b61639a3dfeb41854396b9ca5c830ca668791b52e37"} Apr 23 13:54:26.318789 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:26.318787 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" event={"ID":"10263db1-913b-4de1-b858-7eccae24824d","Type":"ContainerStarted","Data":"5e52e25725915adbf3f91256756d21a635d806168c23a7bb32d1ff0afac34fb5"} Apr 23 13:54:26.319052 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:26.318810 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:28.258556 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:28.258521 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 23 13:54:29.640712 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:54:29.640669 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a400003_12aa_4880_a99e_89a278351fee.slice/crio-conmon-d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a400003_12aa_4880_a99e_89a278351fee.slice/crio-d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b.scope\": RecentStats: unable to find data in memory cache]" Apr 23 13:54:29.643571 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:54:29.640912 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a400003_12aa_4880_a99e_89a278351fee.slice/crio-conmon-d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b.scope\": RecentStats: unable to find data in memory cache]" Apr 23 13:54:29.647380 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:54:29.647334 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a400003_12aa_4880_a99e_89a278351fee.slice/crio-conmon-d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b.scope\": RecentStats: unable to find data in memory cache]" Apr 23 13:54:29.755774 ip-10-0-132-207 
kubenswrapper[2577]: I0423 13:54:29.755747 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:54:29.772918 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:29.772465 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" podStartSLOduration=4.772446409 podStartE2EDuration="4.772446409s" podCreationTimestamp="2026-04-23 13:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:54:26.341617487 +0000 UTC m=+1339.319526079" watchObservedRunningTime="2026-04-23 13:54:29.772446409 +0000 UTC m=+1342.750355002" Apr 23 13:54:29.895110 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:29.895020 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a400003-12aa-4880-a99e-89a278351fee-proxy-tls\") pod \"9a400003-12aa-4880-a99e-89a278351fee\" (UID: \"9a400003-12aa-4880-a99e-89a278351fee\") " Apr 23 13:54:29.895110 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:29.895056 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a400003-12aa-4880-a99e-89a278351fee-openshift-service-ca-bundle\") pod \"9a400003-12aa-4880-a99e-89a278351fee\" (UID: \"9a400003-12aa-4880-a99e-89a278351fee\") " Apr 23 13:54:29.895406 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:29.895372 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a400003-12aa-4880-a99e-89a278351fee-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9a400003-12aa-4880-a99e-89a278351fee" (UID: "9a400003-12aa-4880-a99e-89a278351fee"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:54:29.896952 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:29.896929 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a400003-12aa-4880-a99e-89a278351fee-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9a400003-12aa-4880-a99e-89a278351fee" (UID: "9a400003-12aa-4880-a99e-89a278351fee"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:54:29.995698 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:29.995666 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a400003-12aa-4880-a99e-89a278351fee-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:54:29.995698 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:29.995694 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a400003-12aa-4880-a99e-89a278351fee-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:54:30.329542 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:30.329482 2577 generic.go:358] "Generic (PLEG): container finished" podID="9a400003-12aa-4880-a99e-89a278351fee" containerID="d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b" exitCode=0 Apr 23 13:54:30.329726 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:30.329563 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" Apr 23 13:54:30.329726 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:30.329580 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" event={"ID":"9a400003-12aa-4880-a99e-89a278351fee","Type":"ContainerDied","Data":"d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b"} Apr 23 13:54:30.329726 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:30.329618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr" event={"ID":"9a400003-12aa-4880-a99e-89a278351fee","Type":"ContainerDied","Data":"5ea8e9f75f981d7575dda2d7c2f8967ff17832a22fc20ccf54dec74b97783de4"} Apr 23 13:54:30.329726 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:30.329633 2577 scope.go:117] "RemoveContainer" containerID="d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b" Apr 23 13:54:30.337323 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:30.337306 2577 scope.go:117] "RemoveContainer" containerID="d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b" Apr 23 13:54:30.337554 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:54:30.337536 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b\": container with ID starting with d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b not found: ID does not exist" containerID="d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b" Apr 23 13:54:30.337606 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:30.337562 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b"} err="failed to get container status \"d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b\": rpc error: code = NotFound desc = could not find container \"d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b\": container with ID starting with d7cc6ec3b2362b6bf81cbc8aefbc73b606ccfeba44361ccc99dd06e7e2198e7b not found: ID does not exist" Apr 23 13:54:30.350619 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:30.350595 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr"] Apr 23 13:54:30.356107 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:30.356086 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2d7b2-5475c4b8c9-gxkpr"] Apr 23 13:54:31.583465 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:31.583432 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a400003-12aa-4880-a99e-89a278351fee" path="/var/lib/kubelet/pods/9a400003-12aa-4880-a99e-89a278351fee/volumes" Apr 23 13:54:32.327141 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:32.327113 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:35.353598 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.353569 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7"] Apr 23 13:54:35.353973 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.353776 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" podUID="10263db1-913b-4de1-b858-7eccae24824d" containerName="sequence-graph-1cfcc" containerID="cri-o://12755dd30a8e239a13556b61639a3dfeb41854396b9ca5c830ca668791b52e37" gracePeriod=30 Apr 23 13:54:35.433818 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.433778 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"] Apr 23 13:54:35.434183 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.434153 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kserve-container" containerID="cri-o://709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8" gracePeriod=30 Apr 23 13:54:35.434277 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.434191 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kube-rbac-proxy" containerID="cri-o://cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6" gracePeriod=30 Apr 23 13:54:35.635667 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.635595 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq"] Apr 23 13:54:35.635889 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.635878 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a400003-12aa-4880-a99e-89a278351fee" containerName="ensemble-graph-2d7b2" Apr 23 13:54:35.635939 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.635891 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a400003-12aa-4880-a99e-89a278351fee" containerName="ensemble-graph-2d7b2" Apr 23 13:54:35.635974 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.635944 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a400003-12aa-4880-a99e-89a278351fee" containerName="ensemble-graph-2d7b2" Apr 23 13:54:35.638806 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.638788 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:35.641400 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.641380 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-8d150-predictor-serving-cert\"" Apr 23 13:54:35.641400 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.641390 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-8d150-kube-rbac-proxy-sar-config\"" Apr 23 13:54:35.646104 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.646085 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq"] Apr 23 13:54:35.740131 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.740100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-8d150-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-error-404-isvc-8d150-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8d150-predictor-568b6759f8-b7rcq\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:35.740265 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.740157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-proxy-tls\") pod \"error-404-isvc-8d150-predictor-568b6759f8-b7rcq\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:35.740265 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.740206 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjwt\" (UniqueName: \"kubernetes.io/projected/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-kube-api-access-5cjwt\") pod \"error-404-isvc-8d150-predictor-568b6759f8-b7rcq\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:35.841385 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.841353 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-proxy-tls\") pod \"error-404-isvc-8d150-predictor-568b6759f8-b7rcq\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:35.841623 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.841404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjwt\" (UniqueName: \"kubernetes.io/projected/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-kube-api-access-5cjwt\") pod \"error-404-isvc-8d150-predictor-568b6759f8-b7rcq\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:35.841623 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.841429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-8d150-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-error-404-isvc-8d150-kube-rbac-proxy-sar-config\") pod 
\"error-404-isvc-8d150-predictor-568b6759f8-b7rcq\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:35.841623 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:54:35.841530 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-8d150-predictor-serving-cert: secret "error-404-isvc-8d150-predictor-serving-cert" not found Apr 23 13:54:35.841623 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:54:35.841612 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-proxy-tls podName:2cd36014-1cf4-473d-9b21-8a05cbcca2f7 nodeName:}" failed. No retries permitted until 2026-04-23 13:54:36.341590747 +0000 UTC m=+1349.319499321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-proxy-tls") pod "error-404-isvc-8d150-predictor-568b6759f8-b7rcq" (UID: "2cd36014-1cf4-473d-9b21-8a05cbcca2f7") : secret "error-404-isvc-8d150-predictor-serving-cert" not found Apr 23 13:54:35.842034 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.842011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-8d150-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-error-404-isvc-8d150-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8d150-predictor-568b6759f8-b7rcq\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:35.850475 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:35.850444 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjwt\" (UniqueName: \"kubernetes.io/projected/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-kube-api-access-5cjwt\") pod \"error-404-isvc-8d150-predictor-568b6759f8-b7rcq\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:36.345422 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:36.345394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-proxy-tls\") pod \"error-404-isvc-8d150-predictor-568b6759f8-b7rcq\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:36.348065 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:36.348039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-proxy-tls\") pod \"error-404-isvc-8d150-predictor-568b6759f8-b7rcq\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:36.350129 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:36.350103 2577 generic.go:358] "Generic (PLEG): container finished" podID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerID="cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6" exitCode=2 Apr 23 13:54:36.350240 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:36.350143 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" 
event={"ID":"38b4a054-ea79-474e-ab9a-7cdac8139d19","Type":"ContainerDied","Data":"cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6"} Apr 23 13:54:36.549883 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:36.549846 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:36.669243 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:36.669215 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq"] Apr 23 13:54:36.673226 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:54:36.673196 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd36014_1cf4_473d_9b21_8a05cbcca2f7.slice/crio-077921c12edd32e94a8f8764eca6db0a46ac7b49c230acc14d4d90b5c202bd0f WatchSource:0}: Error finding container 077921c12edd32e94a8f8764eca6db0a46ac7b49c230acc14d4d90b5c202bd0f: Status 404 returned error can't find the container with id 077921c12edd32e94a8f8764eca6db0a46ac7b49c230acc14d4d90b5c202bd0f Apr 23 13:54:37.325397 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:37.325359 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" podUID="10263db1-913b-4de1-b858-7eccae24824d" containerName="sequence-graph-1cfcc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:37.356607 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:37.356573 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" event={"ID":"2cd36014-1cf4-473d-9b21-8a05cbcca2f7","Type":"ContainerStarted","Data":"a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b"} Apr 23 13:54:37.356607 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:37.356608 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" event={"ID":"2cd36014-1cf4-473d-9b21-8a05cbcca2f7","Type":"ContainerStarted","Data":"0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a"} Apr 23 13:54:37.356812 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:37.356618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" event={"ID":"2cd36014-1cf4-473d-9b21-8a05cbcca2f7","Type":"ContainerStarted","Data":"077921c12edd32e94a8f8764eca6db0a46ac7b49c230acc14d4d90b5c202bd0f"} Apr 23 13:54:37.356812 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:37.356721 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:37.376433 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:37.376385 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" podStartSLOduration=2.376371314 podStartE2EDuration="2.376371314s" podCreationTimestamp="2026-04-23 13:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:54:37.374159062 +0000 UTC m=+1350.352067654" watchObservedRunningTime="2026-04-23 13:54:37.376371314 +0000 UTC m=+1350.354279905" Apr 23 13:54:38.130323 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.130285 2577 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.26:8643/healthz\": dial tcp 10.132.0.26:8643: connect: connection refused" Apr 23 13:54:38.259287 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.259248 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 23 13:54:38.359862 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.359833 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:38.361096 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.361067 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 13:54:38.572672 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.572649 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" Apr 23 13:54:38.664003 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.663968 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38b4a054-ea79-474e-ab9a-7cdac8139d19-error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\") pod \"38b4a054-ea79-474e-ab9a-7cdac8139d19\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " Apr 23 13:54:38.664144 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.664070 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38b4a054-ea79-474e-ab9a-7cdac8139d19-proxy-tls\") pod \"38b4a054-ea79-474e-ab9a-7cdac8139d19\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " Apr 23 13:54:38.664144 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.664098 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmpdl\" (UniqueName: \"kubernetes.io/projected/38b4a054-ea79-474e-ab9a-7cdac8139d19-kube-api-access-nmpdl\") pod \"38b4a054-ea79-474e-ab9a-7cdac8139d19\" (UID: \"38b4a054-ea79-474e-ab9a-7cdac8139d19\") " Apr 23 13:54:38.664261 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.664239 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b4a054-ea79-474e-ab9a-7cdac8139d19-error-404-isvc-1cfcc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-1cfcc-kube-rbac-proxy-sar-config") pod "38b4a054-ea79-474e-ab9a-7cdac8139d19" (UID: "38b4a054-ea79-474e-ab9a-7cdac8139d19"). InnerVolumeSpecName "error-404-isvc-1cfcc-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:54:38.666257 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.666228 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b4a054-ea79-474e-ab9a-7cdac8139d19-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "38b4a054-ea79-474e-ab9a-7cdac8139d19" (UID: "38b4a054-ea79-474e-ab9a-7cdac8139d19"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:54:38.666257 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.666228 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b4a054-ea79-474e-ab9a-7cdac8139d19-kube-api-access-nmpdl" (OuterVolumeSpecName: "kube-api-access-nmpdl") pod "38b4a054-ea79-474e-ab9a-7cdac8139d19" (UID: "38b4a054-ea79-474e-ab9a-7cdac8139d19"). InnerVolumeSpecName "kube-api-access-nmpdl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:54:38.765012 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.764942 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmpdl\" (UniqueName: \"kubernetes.io/projected/38b4a054-ea79-474e-ab9a-7cdac8139d19-kube-api-access-nmpdl\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:54:38.765012 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.764969 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/38b4a054-ea79-474e-ab9a-7cdac8139d19-error-404-isvc-1cfcc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:54:38.765012 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:38.764981 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/38b4a054-ea79-474e-ab9a-7cdac8139d19-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:54:39.364016 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.363976 2577 generic.go:358] "Generic (PLEG): container finished" podID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerID="709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8" exitCode=0 Apr 23 13:54:39.364470 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.364054 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" Apr 23 13:54:39.364470 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.364060 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" event={"ID":"38b4a054-ea79-474e-ab9a-7cdac8139d19","Type":"ContainerDied","Data":"709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8"} Apr 23 13:54:39.364470 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.364109 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg" event={"ID":"38b4a054-ea79-474e-ab9a-7cdac8139d19","Type":"ContainerDied","Data":"dec8801bd1f2e35cc6acc3f025974f94570a3bd2e2a046f41812a41b4f714796"} Apr 23 13:54:39.364470 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.364131 2577 scope.go:117] "RemoveContainer" containerID="cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6" Apr 23 13:54:39.364775 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.364748 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 13:54:39.372389 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.372146 2577 scope.go:117] "RemoveContainer" containerID="709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8" Apr 23 13:54:39.378777 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.378763 2577 scope.go:117] "RemoveContainer" containerID="cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6" Apr 23 13:54:39.378988 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:54:39.378972 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6\": container with ID starting with cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6 not found: ID does not exist" containerID="cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6" Apr 23 13:54:39.379030 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.378993 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6"} err="failed to get container status \"cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6\": rpc error: code = NotFound desc = could not find container \"cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6\": container with ID starting with cbff41087885ab2522bb98ac7a318568a186f835d0ebb6712e9b720cb2692cc6 not found: ID does not exist" Apr 23 13:54:39.379030 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.379007 2577 scope.go:117] "RemoveContainer" containerID="709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8" Apr 23 13:54:39.379179 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:54:39.379165 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8\": container with ID starting with 709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8 not found: ID does not exist" 
containerID="709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8" Apr 23 13:54:39.379219 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.379180 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8"} err="failed to get container status \"709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8\": rpc error: code = NotFound desc = could not find container \"709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8\": container with ID starting with 709db47bc88eb8435b65285b1d328847ecb41baa7fd3b18467de35bd33d138f8 not found: ID does not exist" Apr 23 13:54:39.386212 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.386192 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"] Apr 23 13:54:39.391093 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.391074 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-1cfcc-predictor-655bf67bfd-prkpg"] Apr 23 13:54:39.583408 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:39.583372 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" path="/var/lib/kubelet/pods/38b4a054-ea79-474e-ab9a-7cdac8139d19/volumes" Apr 23 13:54:42.325483 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:42.325446 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" podUID="10263db1-913b-4de1-b858-7eccae24824d" containerName="sequence-graph-1cfcc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:44.369312 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:44.369285 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:54:44.369794 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:44.369770 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 13:54:47.324828 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:47.324787 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" podUID="10263db1-913b-4de1-b858-7eccae24824d" containerName="sequence-graph-1cfcc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:47.325257 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:47.324937 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:54:48.259202 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:48.259168 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 13:54:52.324714 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:52.324673 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" podUID="10263db1-913b-4de1-b858-7eccae24824d" containerName="sequence-graph-1cfcc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:54.370556 ip-10-0-132-207 
kubenswrapper[2577]: I0423 13:54:54.370508 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 13:54:57.324905 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:57.324862 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" podUID="10263db1-913b-4de1-b858-7eccae24824d" containerName="sequence-graph-1cfcc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:54:59.861080 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.861048 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4"] Apr 23 13:54:59.861418 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.861344 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kserve-container" Apr 23 13:54:59.861418 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.861359 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kserve-container" Apr 23 13:54:59.861418 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.861370 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kube-rbac-proxy" Apr 23 13:54:59.861418 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.861375 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kube-rbac-proxy" Apr 23 13:54:59.861418 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.861419 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kserve-container" Apr 23 13:54:59.861602 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.861433 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="38b4a054-ea79-474e-ab9a-7cdac8139d19" containerName="kube-rbac-proxy" Apr 23 13:54:59.865746 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.865730 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:54:59.868207 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.868190 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-37fbd-kube-rbac-proxy-sar-config\"" Apr 23 13:54:59.868289 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.868203 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-37fbd-serving-cert\"" Apr 23 13:54:59.873445 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.873423 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4"] Apr 23 13:54:59.912545 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.912515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a070a55-54b0-48a6-b96a-58a34dd2fa30-openshift-service-ca-bundle\") pod \"ensemble-graph-37fbd-576848b778-mxpf4\" (UID: \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\") " pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:54:59.912671 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:54:59.912571 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a070a55-54b0-48a6-b96a-58a34dd2fa30-proxy-tls\") pod \"ensemble-graph-37fbd-576848b778-mxpf4\" (UID: \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\") " pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:55:00.013483 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:00.013441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a070a55-54b0-48a6-b96a-58a34dd2fa30-proxy-tls\") pod \"ensemble-graph-37fbd-576848b778-mxpf4\" (UID: \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\") " pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:55:00.013626 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:00.013541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a070a55-54b0-48a6-b96a-58a34dd2fa30-openshift-service-ca-bundle\") pod \"ensemble-graph-37fbd-576848b778-mxpf4\" (UID: \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\") " pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:55:00.013626 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:55:00.013582 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-37fbd-serving-cert: secret "ensemble-graph-37fbd-serving-cert" not found Apr 23 13:55:00.013696 ip-10-0-132-207 kubenswrapper[2577]: E0423 13:55:00.013651 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a070a55-54b0-48a6-b96a-58a34dd2fa30-proxy-tls podName:9a070a55-54b0-48a6-b96a-58a34dd2fa30 nodeName:}" failed. No retries permitted until 2026-04-23 13:55:00.513636663 +0000 UTC m=+1373.491545233 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9a070a55-54b0-48a6-b96a-58a34dd2fa30-proxy-tls") pod "ensemble-graph-37fbd-576848b778-mxpf4" (UID: "9a070a55-54b0-48a6-b96a-58a34dd2fa30") : secret "ensemble-graph-37fbd-serving-cert" not found Apr 23 13:55:00.014108 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:00.014091 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a070a55-54b0-48a6-b96a-58a34dd2fa30-openshift-service-ca-bundle\") pod \"ensemble-graph-37fbd-576848b778-mxpf4\" (UID: \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\") " pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:55:00.517037 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:00.517002 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a070a55-54b0-48a6-b96a-58a34dd2fa30-proxy-tls\") pod \"ensemble-graph-37fbd-576848b778-mxpf4\" (UID: \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\") " pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:55:00.519354 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:00.519333 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a070a55-54b0-48a6-b96a-58a34dd2fa30-proxy-tls\") pod \"ensemble-graph-37fbd-576848b778-mxpf4\" (UID: \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\") " pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:55:00.776418 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:00.776337 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:55:00.891979 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:00.891956 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4"] Apr 23 13:55:00.894435 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:55:00.894408 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a070a55_54b0_48a6_b96a_58a34dd2fa30.slice/crio-2f82b178a45e0fe18eb57decc724cf474a21155e8e932b972a7d2ae580b930ba WatchSource:0}: Error finding container 2f82b178a45e0fe18eb57decc724cf474a21155e8e932b972a7d2ae580b930ba: Status 404 returned error can't find the container with id 2f82b178a45e0fe18eb57decc724cf474a21155e8e932b972a7d2ae580b930ba Apr 23 13:55:01.431204 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:01.431169 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" event={"ID":"9a070a55-54b0-48a6-b96a-58a34dd2fa30","Type":"ContainerStarted","Data":"d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71"} Apr 23 13:55:01.431204 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:01.431204 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" event={"ID":"9a070a55-54b0-48a6-b96a-58a34dd2fa30","Type":"ContainerStarted","Data":"2f82b178a45e0fe18eb57decc724cf474a21155e8e932b972a7d2ae580b930ba"} Apr 23 13:55:01.431415 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:01.431302 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:55:01.447701 ip-10-0-132-207 kubenswrapper[2577]: I0423 
13:55:01.447656 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" podStartSLOduration=2.447642602 podStartE2EDuration="2.447642602s" podCreationTimestamp="2026-04-23 13:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:55:01.446063014 +0000 UTC m=+1374.423971606" watchObservedRunningTime="2026-04-23 13:55:01.447642602 +0000 UTC m=+1374.425551191" Apr 23 13:55:02.325567 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:02.325530 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" podUID="10263db1-913b-4de1-b858-7eccae24824d" containerName="sequence-graph-1cfcc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:55:04.370620 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:04.370582 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 13:55:05.444171 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:05.444130 2577 generic.go:358] "Generic (PLEG): container finished" podID="10263db1-913b-4de1-b858-7eccae24824d" containerID="12755dd30a8e239a13556b61639a3dfeb41854396b9ca5c830ca668791b52e37" exitCode=137 Apr 23 13:55:05.444534 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:05.444192 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" event={"ID":"10263db1-913b-4de1-b858-7eccae24824d","Type":"ContainerDied","Data":"12755dd30a8e239a13556b61639a3dfeb41854396b9ca5c830ca668791b52e37"} Apr 23 13:55:05.523772 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:05.523750 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:55:05.560346 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:05.560317 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10263db1-913b-4de1-b858-7eccae24824d-openshift-service-ca-bundle\") pod \"10263db1-913b-4de1-b858-7eccae24824d\" (UID: \"10263db1-913b-4de1-b858-7eccae24824d\") " Apr 23 13:55:05.560507 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:05.560368 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10263db1-913b-4de1-b858-7eccae24824d-proxy-tls\") pod \"10263db1-913b-4de1-b858-7eccae24824d\" (UID: \"10263db1-913b-4de1-b858-7eccae24824d\") " Apr 23 13:55:05.560679 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:05.560654 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10263db1-913b-4de1-b858-7eccae24824d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "10263db1-913b-4de1-b858-7eccae24824d" (UID: "10263db1-913b-4de1-b858-7eccae24824d"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:55:05.562369 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:05.562347 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10263db1-913b-4de1-b858-7eccae24824d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "10263db1-913b-4de1-b858-7eccae24824d" (UID: "10263db1-913b-4de1-b858-7eccae24824d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:55:05.661350 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:05.661262 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10263db1-913b-4de1-b858-7eccae24824d-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:55:05.661350 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:05.661294 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10263db1-913b-4de1-b858-7eccae24824d-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 13:55:06.448437 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:06.448403 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" event={"ID":"10263db1-913b-4de1-b858-7eccae24824d","Type":"ContainerDied","Data":"5e52e25725915adbf3f91256756d21a635d806168c23a7bb32d1ff0afac34fb5"} Apr 23 13:55:06.448876 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:06.448450 2577 scope.go:117] "RemoveContainer" containerID="12755dd30a8e239a13556b61639a3dfeb41854396b9ca5c830ca668791b52e37" Apr 23 13:55:06.448876 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:06.448410 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7" Apr 23 13:55:06.473265 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:06.473241 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7"] Apr 23 13:55:06.476113 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:06.476094 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1cfcc-57cb468459-llds7"] Apr 23 13:55:07.440261 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:07.440234 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 13:55:07.583032 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:07.583005 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10263db1-913b-4de1-b858-7eccae24824d" path="/var/lib/kubelet/pods/10263db1-913b-4de1-b858-7eccae24824d/volumes" Apr 23 13:55:14.370537 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:14.370478 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 23 13:55:24.370627 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:24.370595 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 13:55:35.575449 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.575416 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc"] Apr 23 13:55:35.576001 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.575846 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10263db1-913b-4de1-b858-7eccae24824d" containerName="sequence-graph-1cfcc" Apr 23 13:55:35.576001 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.575865 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="10263db1-913b-4de1-b858-7eccae24824d" containerName="sequence-graph-1cfcc" Apr 23 13:55:35.576001 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.575944 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="10263db1-913b-4de1-b858-7eccae24824d" containerName="sequence-graph-1cfcc" Apr 23 13:55:35.578854 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.578833 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 13:55:35.581503 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.581462 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-8d150-kube-rbac-proxy-sar-config\"" Apr 23 13:55:35.581589 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.581504 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-8d150-serving-cert\"" Apr 23 13:55:35.586484 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.586462 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc"] Apr 23 13:55:35.670213 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.670189 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e92d4441-cbca-43ee-9969-2d4880ddff10-proxy-tls\") pod \"sequence-graph-8d150-55944bff46-k6lxc\" (UID: \"e92d4441-cbca-43ee-9969-2d4880ddff10\") " pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 13:55:35.670317 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.670221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e92d4441-cbca-43ee-9969-2d4880ddff10-openshift-service-ca-bundle\") pod \"sequence-graph-8d150-55944bff46-k6lxc\" (UID: \"e92d4441-cbca-43ee-9969-2d4880ddff10\") " pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 13:55:35.771507 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.771469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e92d4441-cbca-43ee-9969-2d4880ddff10-proxy-tls\") pod \"sequence-graph-8d150-55944bff46-k6lxc\" (UID: \"e92d4441-cbca-43ee-9969-2d4880ddff10\") " pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 13:55:35.771604 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.771516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e92d4441-cbca-43ee-9969-2d4880ddff10-openshift-service-ca-bundle\") pod \"sequence-graph-8d150-55944bff46-k6lxc\" (UID: \"e92d4441-cbca-43ee-9969-2d4880ddff10\") " pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 13:55:35.772090 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.772068 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e92d4441-cbca-43ee-9969-2d4880ddff10-openshift-service-ca-bundle\") pod \"sequence-graph-8d150-55944bff46-k6lxc\" (UID: \"e92d4441-cbca-43ee-9969-2d4880ddff10\") " pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 13:55:35.773832 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.773812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e92d4441-cbca-43ee-9969-2d4880ddff10-proxy-tls\") pod \"sequence-graph-8d150-55944bff46-k6lxc\" (UID: \"e92d4441-cbca-43ee-9969-2d4880ddff10\") " pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 13:55:35.889248 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:35.889191 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 13:55:36.006915 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:36.006752 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc"] Apr 23 13:55:36.009698 ip-10-0-132-207 kubenswrapper[2577]: W0423 13:55:36.009670 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode92d4441_cbca_43ee_9969_2d4880ddff10.slice/crio-90f74fb932d9947203652063e534e4c9241f9e4352c44e32a6ae7c6fcd515490 WatchSource:0}: Error finding container 90f74fb932d9947203652063e534e4c9241f9e4352c44e32a6ae7c6fcd515490: Status 404 returned error can't find the container with id 90f74fb932d9947203652063e534e4c9241f9e4352c44e32a6ae7c6fcd515490 Apr 23 13:55:36.535736 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:36.535705 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" event={"ID":"e92d4441-cbca-43ee-9969-2d4880ddff10","Type":"ContainerStarted","Data":"4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664"} Apr 23 13:55:36.535736 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:36.535739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" event={"ID":"e92d4441-cbca-43ee-9969-2d4880ddff10","Type":"ContainerStarted","Data":"90f74fb932d9947203652063e534e4c9241f9e4352c44e32a6ae7c6fcd515490"} Apr 23 13:55:36.535944 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:36.535763 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 13:55:36.555438 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:36.553792 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" podStartSLOduration=1.553775642 podStartE2EDuration="1.553775642s" podCreationTimestamp="2026-04-23 13:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:55:36.550703695 +0000 UTC m=+1409.528612286" watchObservedRunningTime="2026-04-23 13:55:36.553775642 +0000 UTC m=+1409.531684235" Apr 23 13:55:42.544242 ip-10-0-132-207 kubenswrapper[2577]: I0423 13:55:42.544213 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 14:03:14.523689 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.523657 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4"] Apr 23 14:03:14.526035 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.523876 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerName="ensemble-graph-37fbd" containerID="cri-o://d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71" gracePeriod=30 Apr 23 14:03:14.715952 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.715917 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q"] Apr 23 14:03:14.716219 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.716191 2577 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kserve-container" containerID="cri-o://f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053" gracePeriod=30 Apr 23 14:03:14.716309 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.716245 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kube-rbac-proxy" containerID="cri-o://8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2" gracePeriod=30 Apr 23 14:03:14.813949 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.813883 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb"] Apr 23 14:03:14.817097 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.817076 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:14.819869 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.819835 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-5035e-kube-rbac-proxy-sar-config\"" Apr 23 14:03:14.819869 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.819835 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-5035e-predictor-serving-cert\"" Apr 23 14:03:14.829368 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.829348 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb"] Apr 23 14:03:14.868828 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.868804 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-5035e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fca22878-ab4a-4c89-8f94-e4a1e5a08647-error-404-isvc-5035e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:14.868922 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.868858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fca22878-ab4a-4c89-8f94-e4a1e5a08647-proxy-tls\") pod \"error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:14.868922 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.868907 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdx9b\" (UniqueName: \"kubernetes.io/projected/fca22878-ab4a-4c89-8f94-e4a1e5a08647-kube-api-access-vdx9b\") pod \"error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:14.969525 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.969481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fca22878-ab4a-4c89-8f94-e4a1e5a08647-proxy-tls\") pod \"error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:14.969525 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.969535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdx9b\" (UniqueName: \"kubernetes.io/projected/fca22878-ab4a-4c89-8f94-e4a1e5a08647-kube-api-access-vdx9b\") pod \"error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:14.969756 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.969582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-5035e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fca22878-ab4a-4c89-8f94-e4a1e5a08647-error-404-isvc-5035e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:14.970277 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.970256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-5035e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fca22878-ab4a-4c89-8f94-e4a1e5a08647-error-404-isvc-5035e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:14.972055 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.972029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fca22878-ab4a-4c89-8f94-e4a1e5a08647-proxy-tls\") pod \"error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:14.978745 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:14.978724 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdx9b\" (UniqueName: \"kubernetes.io/projected/fca22878-ab4a-4c89-8f94-e4a1e5a08647-kube-api-access-vdx9b\") pod \"error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:15.127431 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:15.127350 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:15.248929 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:15.248905 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb"] Apr 23 14:03:15.251331 ip-10-0-132-207 kubenswrapper[2577]: W0423 14:03:15.251299 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca22878_ab4a_4c89_8f94_e4a1e5a08647.slice/crio-c88dea163369b7683482c5ae49c9b46099155220f0197c1638939189c76bfd0a WatchSource:0}: Error finding container c88dea163369b7683482c5ae49c9b46099155220f0197c1638939189c76bfd0a: Status 404 returned error can't find the container with id c88dea163369b7683482c5ae49c9b46099155220f0197c1638939189c76bfd0a Apr 23 14:03:15.253069 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:15.253051 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:03:15.808878 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:15.808838 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" event={"ID":"fca22878-ab4a-4c89-8f94-e4a1e5a08647","Type":"ContainerStarted","Data":"9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c"} Apr 23 14:03:15.808878 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:15.808884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" event={"ID":"fca22878-ab4a-4c89-8f94-e4a1e5a08647","Type":"ContainerStarted","Data":"244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4"} Apr 23 14:03:15.809447 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:15.808898 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" event={"ID":"fca22878-ab4a-4c89-8f94-e4a1e5a08647","Type":"ContainerStarted","Data":"c88dea163369b7683482c5ae49c9b46099155220f0197c1638939189c76bfd0a"} Apr 23 14:03:15.809447 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:15.808988 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:15.810438 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:15.810417 2577 generic.go:358] "Generic (PLEG): container finished" podID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerID="8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2" exitCode=2 Apr 23 14:03:15.810538 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:15.810466 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" event={"ID":"014980aa-a818-4b54-8ae1-7d91c7a3e843","Type":"ContainerDied","Data":"8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2"} Apr 23 14:03:15.833810 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:15.833773 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" podStartSLOduration=1.833761126 podStartE2EDuration="1.833761126s" podCreationTimestamp="2026-04-23 14:03:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:03:15.831594318 +0000 UTC m=+1868.809502908" 
watchObservedRunningTime="2026-04-23 14:03:15.833761126 +0000 UTC m=+1868.811669778" Apr 23 14:03:16.814342 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:16.814310 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:16.815400 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:16.815379 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 14:03:17.438627 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.438585 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerName="ensemble-graph-37fbd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:03:17.745877 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.745850 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 14:03:17.790035 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.790005 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/014980aa-a818-4b54-8ae1-7d91c7a3e843-proxy-tls\") pod \"014980aa-a818-4b54-8ae1-7d91c7a3e843\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " Apr 23 14:03:17.790174 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.790042 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-37fbd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/014980aa-a818-4b54-8ae1-7d91c7a3e843-error-404-isvc-37fbd-kube-rbac-proxy-sar-config\") pod \"014980aa-a818-4b54-8ae1-7d91c7a3e843\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " Apr 23 14:03:17.790174 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.790104 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qpln\" (UniqueName: \"kubernetes.io/projected/014980aa-a818-4b54-8ae1-7d91c7a3e843-kube-api-access-8qpln\") pod \"014980aa-a818-4b54-8ae1-7d91c7a3e843\" (UID: \"014980aa-a818-4b54-8ae1-7d91c7a3e843\") " Apr 23 14:03:17.790424 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.790401 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/014980aa-a818-4b54-8ae1-7d91c7a3e843-error-404-isvc-37fbd-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-37fbd-kube-rbac-proxy-sar-config") pod "014980aa-a818-4b54-8ae1-7d91c7a3e843" (UID: "014980aa-a818-4b54-8ae1-7d91c7a3e843"). InnerVolumeSpecName "error-404-isvc-37fbd-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:03:17.792087 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.792062 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014980aa-a818-4b54-8ae1-7d91c7a3e843-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "014980aa-a818-4b54-8ae1-7d91c7a3e843" (UID: "014980aa-a818-4b54-8ae1-7d91c7a3e843"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:03:17.792182 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.792089 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014980aa-a818-4b54-8ae1-7d91c7a3e843-kube-api-access-8qpln" (OuterVolumeSpecName: "kube-api-access-8qpln") pod "014980aa-a818-4b54-8ae1-7d91c7a3e843" (UID: "014980aa-a818-4b54-8ae1-7d91c7a3e843"). InnerVolumeSpecName "kube-api-access-8qpln". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:03:17.818331 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.818300 2577 generic.go:358] "Generic (PLEG): container finished" podID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerID="f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053" exitCode=0 Apr 23 14:03:17.818709 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.818373 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" Apr 23 14:03:17.818709 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.818382 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" event={"ID":"014980aa-a818-4b54-8ae1-7d91c7a3e843","Type":"ContainerDied","Data":"f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053"} Apr 23 14:03:17.818709 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.818415 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q" event={"ID":"014980aa-a818-4b54-8ae1-7d91c7a3e843","Type":"ContainerDied","Data":"f18216fe5348e59a889557c0ed5396406eff9db9e62d80f82b12dc9ec3e4216a"} Apr 23 14:03:17.818709 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.818436 2577 scope.go:117] "RemoveContainer" containerID="8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2" Apr 23 14:03:17.819088 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.819061 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 14:03:17.826449 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.826430 2577 scope.go:117] "RemoveContainer" containerID="f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053" Apr 23 14:03:17.832991 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.832976 2577 scope.go:117] "RemoveContainer" containerID="8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2" Apr 23 14:03:17.833217 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:03:17.833196 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2\": container with ID starting with 8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2 not found: ID does not exist" containerID="8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2" Apr 23 14:03:17.833316 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.833221 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2"} err="failed to get container status 
\"8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2\": rpc error: code = NotFound desc = could not find container \"8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2\": container with ID starting with 8daf0f5fda6b4cb6f682d1d2fe2c5950193ddd8ece6875b3f750cae344f6c8b2 not found: ID does not exist" Apr 23 14:03:17.833316 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.833237 2577 scope.go:117] "RemoveContainer" containerID="f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053" Apr 23 14:03:17.833439 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:03:17.833425 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053\": container with ID starting with f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053 not found: ID does not exist" containerID="f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053" Apr 23 14:03:17.833482 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.833441 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053"} err="failed to get container status \"f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053\": rpc error: code = NotFound desc = could not find container \"f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053\": container with ID starting with f2f4f13f09288e783691efa8f72a9c6de6fd795e739fcdfe6a5b9ef79c44e053 not found: ID does not exist" Apr 23 14:03:17.841377 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.841349 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q"] Apr 23 14:03:17.844702 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.844677 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-37fbd-predictor-7544db6574-c755q"] Apr 23 14:03:17.891272 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.891249 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qpln\" (UniqueName: \"kubernetes.io/projected/014980aa-a818-4b54-8ae1-7d91c7a3e843-kube-api-access-8qpln\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:03:17.891352 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.891276 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/014980aa-a818-4b54-8ae1-7d91c7a3e843-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:03:17.891352 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:17.891293 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-37fbd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/014980aa-a818-4b54-8ae1-7d91c7a3e843-error-404-isvc-37fbd-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:03:19.583841 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:19.583808 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" path="/var/lib/kubelet/pods/014980aa-a818-4b54-8ae1-7d91c7a3e843/volumes" Apr 23 14:03:22.438181 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:22.438142 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" 
podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerName="ensemble-graph-37fbd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:03:22.823426 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:22.823350 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:03:22.823909 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:22.823873 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 14:03:27.438814 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:27.438720 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerName="ensemble-graph-37fbd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:03:27.439264 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:27.438835 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 14:03:32.438640 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:32.438599 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerName="ensemble-graph-37fbd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:03:32.824404 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:32.824310 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 14:03:37.438181 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:37.438143 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerName="ensemble-graph-37fbd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:03:42.438070 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:42.438028 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerName="ensemble-graph-37fbd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:03:42.824366 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:42.824286 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 14:03:44.650693 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.650669 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 14:03:44.684211 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.684182 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a070a55-54b0-48a6-b96a-58a34dd2fa30-proxy-tls\") pod \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\" (UID: \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\") " Apr 23 14:03:44.684369 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.684240 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a070a55-54b0-48a6-b96a-58a34dd2fa30-openshift-service-ca-bundle\") pod \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\" (UID: \"9a070a55-54b0-48a6-b96a-58a34dd2fa30\") " Apr 23 14:03:44.684644 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.684615 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a070a55-54b0-48a6-b96a-58a34dd2fa30-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9a070a55-54b0-48a6-b96a-58a34dd2fa30" (UID: "9a070a55-54b0-48a6-b96a-58a34dd2fa30"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:03:44.686213 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.686191 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a070a55-54b0-48a6-b96a-58a34dd2fa30-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9a070a55-54b0-48a6-b96a-58a34dd2fa30" (UID: "9a070a55-54b0-48a6-b96a-58a34dd2fa30"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:03:44.785258 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.785191 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a070a55-54b0-48a6-b96a-58a34dd2fa30-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:03:44.785258 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.785215 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a070a55-54b0-48a6-b96a-58a34dd2fa30-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:03:44.899433 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.899401 2577 generic.go:358] "Generic (PLEG): container finished" podID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerID="d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71" exitCode=0 Apr 23 14:03:44.899600 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.899445 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" event={"ID":"9a070a55-54b0-48a6-b96a-58a34dd2fa30","Type":"ContainerDied","Data":"d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71"} Apr 23 14:03:44.899600 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.899467 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" event={"ID":"9a070a55-54b0-48a6-b96a-58a34dd2fa30","Type":"ContainerDied","Data":"2f82b178a45e0fe18eb57decc724cf474a21155e8e932b972a7d2ae580b930ba"} Apr 23 14:03:44.899600 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.899481 2577 scope.go:117] "RemoveContainer" 
containerID="d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71" Apr 23 14:03:44.899600 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.899480 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4" Apr 23 14:03:44.907059 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.906964 2577 scope.go:117] "RemoveContainer" containerID="d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71" Apr 23 14:03:44.907315 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:03:44.907289 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71\": container with ID starting with d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71 not found: ID does not exist" containerID="d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71" Apr 23 14:03:44.907389 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.907324 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71"} err="failed to get container status \"d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71\": rpc error: code = NotFound desc = could not find container \"d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71\": container with ID starting with d5f148d5d94be2cd2084b96fca4b497127f4f8c1f7f424ef64c2de6537d15b71 not found: ID does not exist" Apr 23 14:03:44.919959 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.919935 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4"] Apr 23 14:03:44.926883 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:44.926864 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-37fbd-576848b778-mxpf4"] Apr 23 14:03:45.583202 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:45.583161 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" path="/var/lib/kubelet/pods/9a070a55-54b0-48a6-b96a-58a34dd2fa30/volumes" Apr 23 14:03:50.233383 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.233353 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc"] Apr 23 14:03:50.233770 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.233583 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerName="sequence-graph-8d150" containerID="cri-o://4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664" gracePeriod=30 Apr 23 14:03:50.554371 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.554294 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq"] Apr 23 14:03:50.554635 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.554599 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kserve-container" containerID="cri-o://0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a" gracePeriod=30 Apr 23 14:03:50.554771 ip-10-0-132-207 
kubenswrapper[2577]: I0423 14:03:50.554655 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kube-rbac-proxy" containerID="cri-o://a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b" gracePeriod=30 Apr 23 14:03:50.602019 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.601992 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk"] Apr 23 14:03:50.602292 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.602281 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerName="ensemble-graph-37fbd" Apr 23 14:03:50.602338 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.602294 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerName="ensemble-graph-37fbd" Apr 23 14:03:50.602338 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.602305 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kserve-container" Apr 23 14:03:50.602338 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.602312 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kserve-container" Apr 23 14:03:50.602338 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.602317 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kube-rbac-proxy" Apr 23 14:03:50.602338 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.602323 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kube-rbac-proxy" Apr 23 14:03:50.602482 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.602366 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a070a55-54b0-48a6-b96a-58a34dd2fa30" containerName="ensemble-graph-37fbd" Apr 23 14:03:50.602482 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.602374 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kserve-container" Apr 23 14:03:50.602482 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.602381 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="014980aa-a818-4b54-8ae1-7d91c7a3e843" containerName="kube-rbac-proxy" Apr 23 14:03:50.606546 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.606528 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.612596 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.612572 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-feffa-kube-rbac-proxy-sar-config\"" Apr 23 14:03:50.612596 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.612584 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-feffa-predictor-serving-cert\"" Apr 23 14:03:50.618599 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.618581 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk"] Apr 23 14:03:50.726293 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.726251 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgbhd\" (UniqueName: \"kubernetes.io/projected/7830be0b-2892-4e47-b17d-bb0aa34efc73-kube-api-access-wgbhd\") pod \"error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.726293 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.726299 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7830be0b-2892-4e47-b17d-bb0aa34efc73-proxy-tls\") pod \"error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.726477 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.726388 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-feffa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7830be0b-2892-4e47-b17d-bb0aa34efc73-error-404-isvc-feffa-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.827604 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.827524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-feffa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7830be0b-2892-4e47-b17d-bb0aa34efc73-error-404-isvc-feffa-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.827604 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.827595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgbhd\" (UniqueName: \"kubernetes.io/projected/7830be0b-2892-4e47-b17d-bb0aa34efc73-kube-api-access-wgbhd\") pod \"error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.827804 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.827625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7830be0b-2892-4e47-b17d-bb0aa34efc73-proxy-tls\") pod 
\"error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.828169 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.828147 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-feffa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7830be0b-2892-4e47-b17d-bb0aa34efc73-error-404-isvc-feffa-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.830008 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.829988 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7830be0b-2892-4e47-b17d-bb0aa34efc73-proxy-tls\") pod \"error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.841442 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.837755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgbhd\" (UniqueName: \"kubernetes.io/projected/7830be0b-2892-4e47-b17d-bb0aa34efc73-kube-api-access-wgbhd\") pod \"error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.918915 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.918882 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:50.922793 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.922765 2577 generic.go:358] "Generic (PLEG): container finished" podID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerID="a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b" exitCode=2 Apr 23 14:03:50.922890 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:50.922826 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" event={"ID":"2cd36014-1cf4-473d-9b21-8a05cbcca2f7","Type":"ContainerDied","Data":"a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b"} Apr 23 14:03:51.044926 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:51.044890 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk"] Apr 23 14:03:51.047869 ip-10-0-132-207 kubenswrapper[2577]: W0423 14:03:51.047841 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7830be0b_2892_4e47_b17d_bb0aa34efc73.slice/crio-7eb35964d19a0a1b84ff2aa6cb94330185fb6d0900ab28192fa3106188baaf24 WatchSource:0}: Error finding container 7eb35964d19a0a1b84ff2aa6cb94330185fb6d0900ab28192fa3106188baaf24: Status 404 returned error can't find the container with id 7eb35964d19a0a1b84ff2aa6cb94330185fb6d0900ab28192fa3106188baaf24 Apr 23 14:03:51.927620 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:51.927582 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" 
event={"ID":"7830be0b-2892-4e47-b17d-bb0aa34efc73","Type":"ContainerStarted","Data":"4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269"} Apr 23 14:03:51.927620 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:51.927621 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" event={"ID":"7830be0b-2892-4e47-b17d-bb0aa34efc73","Type":"ContainerStarted","Data":"a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1"} Apr 23 14:03:51.928025 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:51.927630 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" event={"ID":"7830be0b-2892-4e47-b17d-bb0aa34efc73","Type":"ContainerStarted","Data":"7eb35964d19a0a1b84ff2aa6cb94330185fb6d0900ab28192fa3106188baaf24"} Apr 23 14:03:51.928025 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:51.927802 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:51.946868 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:51.946830 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" podStartSLOduration=1.946818352 podStartE2EDuration="1.946818352s" podCreationTimestamp="2026-04-23 14:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:03:51.945970938 +0000 UTC m=+1904.923879530" watchObservedRunningTime="2026-04-23 14:03:51.946818352 +0000 UTC m=+1904.924726943" Apr 23 14:03:52.542751 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:52.542717 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerName="sequence-graph-8d150" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:03:52.824119 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:52.824034 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 23 14:03:52.931341 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:52.931310 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:52.932503 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:52.932464 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:03:53.689530 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.689508 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 14:03:53.749441 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.749376 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-proxy-tls\") pod \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " Apr 23 14:03:53.749441 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.749425 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cjwt\" (UniqueName: \"kubernetes.io/projected/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-kube-api-access-5cjwt\") pod \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " Apr 23 14:03:53.749653 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.749460 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-8d150-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-error-404-isvc-8d150-kube-rbac-proxy-sar-config\") pod \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\" (UID: \"2cd36014-1cf4-473d-9b21-8a05cbcca2f7\") " Apr 23 14:03:53.749860 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.749827 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-error-404-isvc-8d150-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-8d150-kube-rbac-proxy-sar-config") pod "2cd36014-1cf4-473d-9b21-8a05cbcca2f7" (UID: "2cd36014-1cf4-473d-9b21-8a05cbcca2f7"). InnerVolumeSpecName "error-404-isvc-8d150-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:03:53.751247 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.751227 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-kube-api-access-5cjwt" (OuterVolumeSpecName: "kube-api-access-5cjwt") pod "2cd36014-1cf4-473d-9b21-8a05cbcca2f7" (UID: "2cd36014-1cf4-473d-9b21-8a05cbcca2f7"). InnerVolumeSpecName "kube-api-access-5cjwt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:03:53.751414 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.751393 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2cd36014-1cf4-473d-9b21-8a05cbcca2f7" (UID: "2cd36014-1cf4-473d-9b21-8a05cbcca2f7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:03:53.850617 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.850579 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-8d150-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-error-404-isvc-8d150-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:03:53.850617 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.850614 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:03:53.850617 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.850625 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5cjwt\" (UniqueName: \"kubernetes.io/projected/2cd36014-1cf4-473d-9b21-8a05cbcca2f7-kube-api-access-5cjwt\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:03:53.936123 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.936088 2577 generic.go:358] "Generic (PLEG): container finished" podID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerID="0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a" exitCode=0 Apr 23 14:03:53.936530 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.936169 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" Apr 23 14:03:53.936530 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.936175 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" event={"ID":"2cd36014-1cf4-473d-9b21-8a05cbcca2f7","Type":"ContainerDied","Data":"0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a"} Apr 23 14:03:53.936530 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.936209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq" event={"ID":"2cd36014-1cf4-473d-9b21-8a05cbcca2f7","Type":"ContainerDied","Data":"077921c12edd32e94a8f8764eca6db0a46ac7b49c230acc14d4d90b5c202bd0f"} Apr 23 14:03:53.936530 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.936229 2577 scope.go:117] "RemoveContainer" containerID="a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b" Apr 23 14:03:53.936894 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.936869 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:03:53.944282 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.944267 2577 scope.go:117] "RemoveContainer" containerID="0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a" Apr 23 14:03:53.950821 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.950798 2577 scope.go:117] "RemoveContainer" containerID="a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b" Apr 23 14:03:53.951046 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:03:53.951027 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b\": container with ID 
starting with a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b not found: ID does not exist" containerID="a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b" Apr 23 14:03:53.951105 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.951055 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b"} err="failed to get container status \"a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b\": rpc error: code = NotFound desc = could not find container \"a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b\": container with ID starting with a156e8e7057c64bb8fc001901c09d63f58f8f358a01a7cf6f88d3e69898bd03b not found: ID does not exist" Apr 23 14:03:53.951105 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.951073 2577 scope.go:117] "RemoveContainer" containerID="0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a" Apr 23 14:03:53.951317 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:03:53.951298 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a\": container with ID starting with 0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a not found: ID does not exist" containerID="0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a" Apr 23 14:03:53.951355 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.951324 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a"} err="failed to get container status \"0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a\": rpc error: code = NotFound desc = could not find container \"0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a\": container with ID starting with 0510ea135d5eba008ba32bac722fc527294c3d3d3bd3786e8d177ba6eb38544a not found: ID does not exist" Apr 23 14:03:53.960159 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.960138 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq"] Apr 23 14:03:53.965704 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:53.965682 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d150-predictor-568b6759f8-b7rcq"] Apr 23 14:03:55.583048 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:55.583011 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" path="/var/lib/kubelet/pods/2cd36014-1cf4-473d-9b21-8a05cbcca2f7/volumes" Apr 23 14:03:57.542794 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:57.542756 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerName="sequence-graph-8d150" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:03:58.940653 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:58.940621 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:03:58.941199 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:03:58.941158 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:04:02.542669 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:02.542629 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerName="sequence-graph-8d150" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:04:02.543042 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:02.542757 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 14:04:02.825253 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:02.825184 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:04:07.542572 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:07.542534 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerName="sequence-graph-8d150" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:04:08.941860 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:08.941818 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:04:12.542786 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:12.542737 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerName="sequence-graph-8d150" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:04:14.774216 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.774185 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5"] Apr 23 14:04:14.774623 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.774502 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kube-rbac-proxy" Apr 23 14:04:14.774623 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.774517 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kube-rbac-proxy" Apr 23 14:04:14.774623 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.774533 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kserve-container" Apr 23 14:04:14.774623 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.774539 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kserve-container" Apr 23 14:04:14.774623 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.774592 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kube-rbac-proxy" Apr 23 14:04:14.774623 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.774600 2577 
memory_manager.go:356] "RemoveStaleState removing state" podUID="2cd36014-1cf4-473d-9b21-8a05cbcca2f7" containerName="kserve-container" Apr 23 14:04:14.778840 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.778823 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:14.781641 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.781620 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-5035e-kube-rbac-proxy-sar-config\"" Apr 23 14:04:14.781768 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.781624 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-5035e-serving-cert\"" Apr 23 14:04:14.787005 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.786983 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5"] Apr 23 14:04:14.908220 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.908190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-openshift-service-ca-bundle\") pod \"splitter-graph-5035e-5c58fb744-qcjf5\" (UID: \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\") " pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:14.908388 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:14.908251 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-proxy-tls\") pod \"splitter-graph-5035e-5c58fb744-qcjf5\" (UID: \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\") " pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:15.009265 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:15.009237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-openshift-service-ca-bundle\") pod \"splitter-graph-5035e-5c58fb744-qcjf5\" (UID: \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\") " pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:15.009394 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:15.009283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-proxy-tls\") pod \"splitter-graph-5035e-5c58fb744-qcjf5\" (UID: \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\") " pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:15.009394 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:04:15.009374 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-5035e-serving-cert: secret "splitter-graph-5035e-serving-cert" not found Apr 23 14:04:15.009481 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:04:15.009431 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-proxy-tls podName:9cc3527f-b5d1-4e0c-bb4c-08b69b815709 nodeName:}" failed. No retries permitted until 2026-04-23 14:04:15.509415573 +0000 UTC m=+1928.487324144 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-proxy-tls") pod "splitter-graph-5035e-5c58fb744-qcjf5" (UID: "9cc3527f-b5d1-4e0c-bb4c-08b69b815709") : secret "splitter-graph-5035e-serving-cert" not found Apr 23 14:04:15.009838 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:15.009818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-openshift-service-ca-bundle\") pod \"splitter-graph-5035e-5c58fb744-qcjf5\" (UID: \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\") " pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:15.514551 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:15.514485 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-proxy-tls\") pod \"splitter-graph-5035e-5c58fb744-qcjf5\" (UID: \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\") " pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:15.516847 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:15.516829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-proxy-tls\") pod \"splitter-graph-5035e-5c58fb744-qcjf5\" (UID: \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\") " pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:15.689085 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:15.689058 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:15.804597 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:15.804574 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5"] Apr 23 14:04:15.807050 ip-10-0-132-207 kubenswrapper[2577]: W0423 14:04:15.807018 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cc3527f_b5d1_4e0c_bb4c_08b69b815709.slice/crio-469587ca0694018e6baa14e03d366c6dfc2f4f5b3d904f1bfd604899252538b6 WatchSource:0}: Error finding container 469587ca0694018e6baa14e03d366c6dfc2f4f5b3d904f1bfd604899252538b6: Status 404 returned error can't find the container with id 469587ca0694018e6baa14e03d366c6dfc2f4f5b3d904f1bfd604899252538b6 Apr 23 14:04:16.012099 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:16.012063 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" event={"ID":"9cc3527f-b5d1-4e0c-bb4c-08b69b815709","Type":"ContainerStarted","Data":"d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712"} Apr 23 14:04:16.012099 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:16.012099 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" event={"ID":"9cc3527f-b5d1-4e0c-bb4c-08b69b815709","Type":"ContainerStarted","Data":"469587ca0694018e6baa14e03d366c6dfc2f4f5b3d904f1bfd604899252538b6"} Apr 23 14:04:16.012305 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:16.012185 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:16.030460 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:16.030416 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" podStartSLOduration=2.030402687 podStartE2EDuration="2.030402687s" podCreationTimestamp="2026-04-23 14:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:04:16.029848745 +0000 UTC m=+1929.007757338" watchObservedRunningTime="2026-04-23 14:04:16.030402687 +0000 UTC m=+1929.008311278" Apr 23 14:04:17.542365 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:17.542326 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerName="sequence-graph-8d150" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:04:18.941453 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:18.941417 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:04:20.375639 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:20.375616 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 14:04:20.557771 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:20.557685 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e92d4441-cbca-43ee-9969-2d4880ddff10-openshift-service-ca-bundle\") pod \"e92d4441-cbca-43ee-9969-2d4880ddff10\" (UID: \"e92d4441-cbca-43ee-9969-2d4880ddff10\") " Apr 23 14:04:20.557940 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:20.557791 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e92d4441-cbca-43ee-9969-2d4880ddff10-proxy-tls\") pod \"e92d4441-cbca-43ee-9969-2d4880ddff10\" (UID: \"e92d4441-cbca-43ee-9969-2d4880ddff10\") " Apr 23 14:04:20.558107 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:20.558070 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e92d4441-cbca-43ee-9969-2d4880ddff10-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "e92d4441-cbca-43ee-9969-2d4880ddff10" (UID: "e92d4441-cbca-43ee-9969-2d4880ddff10"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:04:20.559847 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:20.559823 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92d4441-cbca-43ee-9969-2d4880ddff10-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e92d4441-cbca-43ee-9969-2d4880ddff10" (UID: "e92d4441-cbca-43ee-9969-2d4880ddff10"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:04:20.658479 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:20.658446 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e92d4441-cbca-43ee-9969-2d4880ddff10-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:04:20.658479 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:20.658475 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e92d4441-cbca-43ee-9969-2d4880ddff10-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:04:21.028604 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:21.028571 2577 generic.go:358] "Generic (PLEG): container finished" podID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerID="4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664" exitCode=0 Apr 23 14:04:21.028772 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:21.028631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" event={"ID":"e92d4441-cbca-43ee-9969-2d4880ddff10","Type":"ContainerDied","Data":"4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664"} Apr 23 14:04:21.028772 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:21.028634 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" Apr 23 14:04:21.028772 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:21.028657 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc" event={"ID":"e92d4441-cbca-43ee-9969-2d4880ddff10","Type":"ContainerDied","Data":"90f74fb932d9947203652063e534e4c9241f9e4352c44e32a6ae7c6fcd515490"} Apr 23 14:04:21.028772 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:21.028671 2577 scope.go:117] "RemoveContainer" containerID="4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664" Apr 23 14:04:21.036405 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:21.036385 2577 scope.go:117] "RemoveContainer" containerID="4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664" Apr 23 14:04:21.036665 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:04:21.036646 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664\": container with ID starting with 4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664 not found: ID does not exist" containerID="4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664" Apr 23 14:04:21.036728 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:21.036674 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664"} err="failed to get container status \"4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664\": rpc error: code = NotFound desc = could not find container \"4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664\": container with ID starting with 4a18d44cea327f2ded44260e9264feda8321099630910ef53d84a23babc22664 not found: ID does not exist" Apr 23 14:04:21.049651 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:21.049629 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc"] Apr 23 14:04:21.051701 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:21.051659 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d150-55944bff46-k6lxc"] Apr 23 14:04:21.582907 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:21.582864 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" path="/var/lib/kubelet/pods/e92d4441-cbca-43ee-9969-2d4880ddff10/volumes" Apr 23 14:04:22.020798 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:22.020761 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:24.854957 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:24.854881 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5"] Apr 23 14:04:24.855316 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:24.855101 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerName="splitter-graph-5035e" containerID="cri-o://d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712" gracePeriod=30 Apr 23 14:04:24.920077 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:24.920042 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb"] Apr 23 14:04:24.920431 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:24.920406 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kserve-container" containerID="cri-o://244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4" gracePeriod=30 Apr 23 14:04:24.920562 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:24.920503 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kube-rbac-proxy" containerID="cri-o://9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c" gracePeriod=30 Apr 23 14:04:25.041509 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.041457 2577 generic.go:358] "Generic (PLEG): container finished" podID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerID="9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c" exitCode=2 Apr 23 14:04:25.041639 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.041518 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" event={"ID":"fca22878-ab4a-4c89-8f94-e4a1e5a08647","Type":"ContainerDied","Data":"9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c"} Apr 23 14:04:25.136183 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.136109 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8"] Apr 23 14:04:25.136423 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.136411 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerName="sequence-graph-8d150" Apr 23 14:04:25.136468 ip-10-0-132-207 kubenswrapper[2577]: I0423 
14:04:25.136424 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerName="sequence-graph-8d150" Apr 23 14:04:25.136531 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.136470 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e92d4441-cbca-43ee-9969-2d4880ddff10" containerName="sequence-graph-8d150" Apr 23 14:04:25.139270 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.139255 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.144953 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.144929 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\"" Apr 23 14:04:25.144953 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.144929 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a8c3e-predictor-serving-cert\"" Apr 23 14:04:25.150957 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.150934 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8"] Apr 23 14:04:25.297748 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.297704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-proxy-tls\") pod \"error-404-isvc-a8c3e-predictor-965fc7978-k4bn8\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") " pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.297941 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.297760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnxmd\" (UniqueName: \"kubernetes.io/projected/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-kube-api-access-hnxmd\") pod \"error-404-isvc-a8c3e-predictor-965fc7978-k4bn8\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") " pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.297941 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.297789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a8c3e-predictor-965fc7978-k4bn8\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") " pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.399234 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.399134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnxmd\" (UniqueName: \"kubernetes.io/projected/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-kube-api-access-hnxmd\") pod \"error-404-isvc-a8c3e-predictor-965fc7978-k4bn8\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") " pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.399234 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.399194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a8c3e-predictor-965fc7978-k4bn8\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") " pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.399535 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.399255 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-proxy-tls\") pod \"error-404-isvc-a8c3e-predictor-965fc7978-k4bn8\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") " pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.399960 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.399938 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a8c3e-predictor-965fc7978-k4bn8\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") " pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.401824 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.401793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-proxy-tls\") pod \"error-404-isvc-a8c3e-predictor-965fc7978-k4bn8\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") " pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.407200 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.407177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnxmd\" (UniqueName: \"kubernetes.io/projected/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-kube-api-access-hnxmd\") pod \"error-404-isvc-a8c3e-predictor-965fc7978-k4bn8\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") " pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.449120 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.449087 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:25.567648 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:25.567619 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8"] Apr 23 14:04:25.570955 ip-10-0-132-207 kubenswrapper[2577]: W0423 14:04:25.570922 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ba177c1_5524_4560_8e62_9ef0ce9f1cbf.slice/crio-e02a2f22e25c8417317425b92b5f109e61b1222e008b53868c2d24a3e720ec8b WatchSource:0}: Error finding container e02a2f22e25c8417317425b92b5f109e61b1222e008b53868c2d24a3e720ec8b: Status 404 returned error can't find the container with id e02a2f22e25c8417317425b92b5f109e61b1222e008b53868c2d24a3e720ec8b Apr 23 14:04:26.045732 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:26.045685 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" event={"ID":"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf","Type":"ContainerStarted","Data":"fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22"} Apr 23 14:04:26.045732 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:26.045722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" event={"ID":"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf","Type":"ContainerStarted","Data":"ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1"} Apr 23 14:04:26.045732 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:26.045732 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" event={"ID":"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf","Type":"ContainerStarted","Data":"e02a2f22e25c8417317425b92b5f109e61b1222e008b53868c2d24a3e720ec8b"} Apr 23 14:04:26.046210 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:26.045803 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:26.066322 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:26.066267 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" podStartSLOduration=1.066249029 podStartE2EDuration="1.066249029s" podCreationTimestamp="2026-04-23 14:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:04:26.066045677 +0000 UTC m=+1939.043954289" watchObservedRunningTime="2026-04-23 14:04:26.066249029 +0000 UTC m=+1939.044157624" Apr 23 14:04:27.020269 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:27.020231 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerName="splitter-graph-5035e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:04:27.050851 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:27.050827 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:27.052030 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:27.052004 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 14:04:27.819146 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:27.819110 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused" Apr 23 14:04:28.054338 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.054300 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 14:04:28.254611 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.254591 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:04:28.322554 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.322468 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdx9b\" (UniqueName: \"kubernetes.io/projected/fca22878-ab4a-4c89-8f94-e4a1e5a08647-kube-api-access-vdx9b\") pod \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " Apr 23 14:04:28.322554 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.322532 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-5035e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fca22878-ab4a-4c89-8f94-e4a1e5a08647-error-404-isvc-5035e-kube-rbac-proxy-sar-config\") pod \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " Apr 23 14:04:28.322769 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.322587 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fca22878-ab4a-4c89-8f94-e4a1e5a08647-proxy-tls\") pod \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\" (UID: \"fca22878-ab4a-4c89-8f94-e4a1e5a08647\") " Apr 23 14:04:28.322925 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.322904 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fca22878-ab4a-4c89-8f94-e4a1e5a08647-error-404-isvc-5035e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-5035e-kube-rbac-proxy-sar-config") pod "fca22878-ab4a-4c89-8f94-e4a1e5a08647" (UID: "fca22878-ab4a-4c89-8f94-e4a1e5a08647"). InnerVolumeSpecName "error-404-isvc-5035e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:04:28.324564 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.324539 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fca22878-ab4a-4c89-8f94-e4a1e5a08647-kube-api-access-vdx9b" (OuterVolumeSpecName: "kube-api-access-vdx9b") pod "fca22878-ab4a-4c89-8f94-e4a1e5a08647" (UID: "fca22878-ab4a-4c89-8f94-e4a1e5a08647"). InnerVolumeSpecName "kube-api-access-vdx9b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:04:28.324672 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.324640 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fca22878-ab4a-4c89-8f94-e4a1e5a08647-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fca22878-ab4a-4c89-8f94-e4a1e5a08647" (UID: "fca22878-ab4a-4c89-8f94-e4a1e5a08647"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:04:28.423264 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.423233 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fca22878-ab4a-4c89-8f94-e4a1e5a08647-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:04:28.423264 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.423261 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vdx9b\" (UniqueName: \"kubernetes.io/projected/fca22878-ab4a-4c89-8f94-e4a1e5a08647-kube-api-access-vdx9b\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:04:28.423434 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.423274 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-5035e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fca22878-ab4a-4c89-8f94-e4a1e5a08647-error-404-isvc-5035e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:04:28.941083 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:28.941045 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 23 14:04:29.064549 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.064511 2577 generic.go:358] "Generic (PLEG): container finished" podID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerID="244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4" exitCode=0 Apr 23 14:04:29.065008 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.064577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" event={"ID":"fca22878-ab4a-4c89-8f94-e4a1e5a08647","Type":"ContainerDied","Data":"244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4"} Apr 23 14:04:29.065008 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.064607 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" event={"ID":"fca22878-ab4a-4c89-8f94-e4a1e5a08647","Type":"ContainerDied","Data":"c88dea163369b7683482c5ae49c9b46099155220f0197c1638939189c76bfd0a"} Apr 23 14:04:29.065008 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.064606 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb" Apr 23 14:04:29.065008 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.064676 2577 scope.go:117] "RemoveContainer" containerID="9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c" Apr 23 14:04:29.073674 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.073653 2577 scope.go:117] "RemoveContainer" containerID="244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4" Apr 23 14:04:29.080897 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.080877 2577 scope.go:117] "RemoveContainer" containerID="9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c" Apr 23 14:04:29.081163 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:04:29.081142 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c\": container with ID starting with 9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c not found: ID does not exist" containerID="9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c" Apr 23 14:04:29.081225 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.081172 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c"} err="failed to get container status \"9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c\": rpc error: code = NotFound desc = could not find container \"9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c\": container with ID starting with 9f32cde5b9bfecc7bef023e5847d48de953d3f0766b93628ea7dc91b0791621c not found: ID does not exist" Apr 23 14:04:29.081225 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.081190 2577 scope.go:117] "RemoveContainer" containerID="244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4" Apr 23 14:04:29.081446 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:04:29.081423 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4\": container with ID starting with 244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4 not found: ID does not exist" containerID="244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4" Apr 23 14:04:29.081522 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.081458 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4"} err="failed to get container status \"244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4\": rpc error: code = NotFound desc = could not find container \"244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4\": container with ID starting with 244266ae8dbc90a36717af084239edc74182ffa7c7451765054b9224959f5da4 not found: ID does not exist" Apr 23 14:04:29.087570 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.087477 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb"] Apr 23 14:04:29.095119 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:29.095098 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5035e-predictor-6d8685dc6b-q4hpb"] Apr 23 14:04:29.583275 ip-10-0-132-207 
kubenswrapper[2577]: I0423 14:04:29.583242 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" path="/var/lib/kubelet/pods/fca22878-ab4a-4c89-8f94-e4a1e5a08647/volumes" Apr 23 14:04:32.019934 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:32.019894 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerName="splitter-graph-5035e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:04:33.058933 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:33.058905 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:04:33.059384 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:33.059357 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 14:04:37.019898 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:37.019854 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerName="splitter-graph-5035e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:04:37.020272 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:37.019969 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:38.942385 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:38.942357 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:04:42.019429 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:42.019390 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerName="splitter-graph-5035e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:04:43.059733 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:43.059697 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 14:04:47.019462 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:47.019423 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerName="splitter-graph-5035e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:04:50.448803 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.448770 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb"] Apr 23 14:04:50.449150 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.449048 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kube-rbac-proxy" Apr 23 14:04:50.449150 ip-10-0-132-207 
kubenswrapper[2577]: I0423 14:04:50.449058 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kube-rbac-proxy" Apr 23 14:04:50.449150 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.449073 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kserve-container" Apr 23 14:04:50.449150 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.449078 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kserve-container" Apr 23 14:04:50.449150 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.449125 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kserve-container" Apr 23 14:04:50.449150 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.449135 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="fca22878-ab4a-4c89-8f94-e4a1e5a08647" containerName="kube-rbac-proxy" Apr 23 14:04:50.451976 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.451959 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:04:50.454565 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.454547 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-feffa-kube-rbac-proxy-sar-config\"" Apr 23 14:04:50.454639 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.454584 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-feffa-serving-cert\"" Apr 23 14:04:50.457930 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.457906 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb"] Apr 23 14:04:50.482197 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.482178 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/361259e9-32f4-43ac-bfe5-7d4ae4337229-proxy-tls\") pod \"switch-graph-feffa-7cd874cdb5-n9vxb\" (UID: \"361259e9-32f4-43ac-bfe5-7d4ae4337229\") " pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:04:50.482297 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.482207 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/361259e9-32f4-43ac-bfe5-7d4ae4337229-openshift-service-ca-bundle\") pod \"switch-graph-feffa-7cd874cdb5-n9vxb\" (UID: \"361259e9-32f4-43ac-bfe5-7d4ae4337229\") " pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:04:50.583082 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.583057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/361259e9-32f4-43ac-bfe5-7d4ae4337229-proxy-tls\") pod \"switch-graph-feffa-7cd874cdb5-n9vxb\" (UID: \"361259e9-32f4-43ac-bfe5-7d4ae4337229\") " pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:04:50.583182 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.583090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/361259e9-32f4-43ac-bfe5-7d4ae4337229-openshift-service-ca-bundle\") pod \"switch-graph-feffa-7cd874cdb5-n9vxb\" (UID: \"361259e9-32f4-43ac-bfe5-7d4ae4337229\") " pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:04:50.583223 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:04:50.583194 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-feffa-serving-cert: secret "switch-graph-feffa-serving-cert" not found Apr 23 14:04:50.583266 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:04:50.583255 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/361259e9-32f4-43ac-bfe5-7d4ae4337229-proxy-tls podName:361259e9-32f4-43ac-bfe5-7d4ae4337229 nodeName:}" failed. No retries permitted until 2026-04-23 14:04:51.083239169 +0000 UTC m=+1964.061147738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/361259e9-32f4-43ac-bfe5-7d4ae4337229-proxy-tls") pod "switch-graph-feffa-7cd874cdb5-n9vxb" (UID: "361259e9-32f4-43ac-bfe5-7d4ae4337229") : secret "switch-graph-feffa-serving-cert" not found Apr 23 14:04:50.583678 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:50.583662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/361259e9-32f4-43ac-bfe5-7d4ae4337229-openshift-service-ca-bundle\") pod \"switch-graph-feffa-7cd874cdb5-n9vxb\" (UID: \"361259e9-32f4-43ac-bfe5-7d4ae4337229\") " pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:04:51.086310 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:51.086272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/361259e9-32f4-43ac-bfe5-7d4ae4337229-proxy-tls\") pod \"switch-graph-feffa-7cd874cdb5-n9vxb\" (UID: \"361259e9-32f4-43ac-bfe5-7d4ae4337229\") " pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:04:51.088623 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:51.088606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/361259e9-32f4-43ac-bfe5-7d4ae4337229-proxy-tls\") pod \"switch-graph-feffa-7cd874cdb5-n9vxb\" (UID: \"361259e9-32f4-43ac-bfe5-7d4ae4337229\") " pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:04:51.363215 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:51.363133 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:04:51.476884 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:51.476860 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb"] Apr 23 14:04:51.479053 ip-10-0-132-207 kubenswrapper[2577]: W0423 14:04:51.479024 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod361259e9_32f4_43ac_bfe5_7d4ae4337229.slice/crio-0fc0ab5366e3d7e5cad402df8ee78040af76afc0ebc751f05e80f95072111bc2 WatchSource:0}: Error finding container 0fc0ab5366e3d7e5cad402df8ee78040af76afc0ebc751f05e80f95072111bc2: Status 404 returned error can't find the container with id 0fc0ab5366e3d7e5cad402df8ee78040af76afc0ebc751f05e80f95072111bc2 Apr 23 14:04:52.019405 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:52.019362 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerName="splitter-graph-5035e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:04:52.140205 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:52.140168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" event={"ID":"361259e9-32f4-43ac-bfe5-7d4ae4337229","Type":"ContainerStarted","Data":"7bfc4d292052f016dca898c5a3c7aa3100314329d4095c48672a714af5106912"} Apr 23 14:04:52.140205 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:52.140209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" event={"ID":"361259e9-32f4-43ac-bfe5-7d4ae4337229","Type":"ContainerStarted","Data":"0fc0ab5366e3d7e5cad402df8ee78040af76afc0ebc751f05e80f95072111bc2"} Apr 23 14:04:52.140426 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:52.140249 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:04:52.158532 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:52.158480 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" podStartSLOduration=2.158467953 podStartE2EDuration="2.158467953s" podCreationTimestamp="2026-04-23 14:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:04:52.15745668 +0000 UTC m=+1965.135365274" watchObservedRunningTime="2026-04-23 14:04:52.158467953 +0000 UTC m=+1965.136376548" Apr 23 14:04:53.060239 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:53.060202 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 14:04:54.991998 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:54.991975 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:55.116143 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.116058 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-proxy-tls\") pod \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\" (UID: \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\") " Apr 23 14:04:55.116143 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.116134 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-openshift-service-ca-bundle\") pod \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\" (UID: \"9cc3527f-b5d1-4e0c-bb4c-08b69b815709\") " Apr 23 14:04:55.116510 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.116453 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9cc3527f-b5d1-4e0c-bb4c-08b69b815709" (UID: "9cc3527f-b5d1-4e0c-bb4c-08b69b815709"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:04:55.118047 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.118026 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9cc3527f-b5d1-4e0c-bb4c-08b69b815709" (UID: "9cc3527f-b5d1-4e0c-bb4c-08b69b815709"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:04:55.149351 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.149325 2577 generic.go:358] "Generic (PLEG): container finished" podID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerID="d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712" exitCode=0 Apr 23 14:04:55.149450 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.149388 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" Apr 23 14:04:55.149450 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.149405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" event={"ID":"9cc3527f-b5d1-4e0c-bb4c-08b69b815709","Type":"ContainerDied","Data":"d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712"} Apr 23 14:04:55.149450 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.149445 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5" event={"ID":"9cc3527f-b5d1-4e0c-bb4c-08b69b815709","Type":"ContainerDied","Data":"469587ca0694018e6baa14e03d366c6dfc2f4f5b3d904f1bfd604899252538b6"} Apr 23 14:04:55.149617 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.149466 2577 scope.go:117] "RemoveContainer" containerID="d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712" Apr 23 14:04:55.157414 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.157396 2577 scope.go:117] "RemoveContainer" containerID="d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712" Apr 23 14:04:55.157680 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:04:55.157658 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712\": container with ID starting with d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712 not found: ID does not exist" containerID="d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712" Apr 23 14:04:55.157780 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.157684 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712"} err="failed to get container status \"d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712\": rpc error: code = NotFound desc = could not find container \"d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712\": container with ID starting with d03179c6754cbeb2b69beea08436e3617ae9c044ab77f7d0e74a9860ad8a0712 not found: ID does not exist" Apr 23 14:04:55.170638 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.170616 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5"] Apr 23 14:04:55.174282 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.174261 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-5035e-5c58fb744-qcjf5"] Apr 23 14:04:55.217148 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.217124 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:04:55.217148 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.217144 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cc3527f-b5d1-4e0c-bb4c-08b69b815709-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:04:55.583863 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:55.583831 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" 
path="/var/lib/kubelet/pods/9cc3527f-b5d1-4e0c-bb4c-08b69b815709/volumes" Apr 23 14:04:58.149014 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:04:58.148928 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" Apr 23 14:05:03.059387 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:03.059345 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 23 14:05:13.060617 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:13.060587 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" Apr 23 14:05:25.072583 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.072555 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7"] Apr 23 14:05:25.072936 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.072848 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerName="splitter-graph-5035e" Apr 23 14:05:25.072936 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.072858 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerName="splitter-graph-5035e" Apr 23 14:05:25.072936 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.072910 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cc3527f-b5d1-4e0c-bb4c-08b69b815709" containerName="splitter-graph-5035e" Apr 23 14:05:25.076993 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.076968 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:05:25.079639 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.079617 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-a8c3e-serving-cert\"" Apr 23 14:05:25.079752 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.079643 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-a8c3e-kube-rbac-proxy-sar-config\"" Apr 23 14:05:25.085530 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.085508 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7"] Apr 23 14:05:25.136411 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.136387 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca53ba5-e5a9-4af1-8910-72a0f613d62e-openshift-service-ca-bundle\") pod \"splitter-graph-a8c3e-bc877ccbf-qwvt7\" (UID: \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\") " pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:05:25.136535 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.136427 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fca53ba5-e5a9-4af1-8910-72a0f613d62e-proxy-tls\") pod \"splitter-graph-a8c3e-bc877ccbf-qwvt7\" (UID: \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\") " pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:05:25.237276 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.237249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca53ba5-e5a9-4af1-8910-72a0f613d62e-openshift-service-ca-bundle\") pod \"splitter-graph-a8c3e-bc877ccbf-qwvt7\" (UID: \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\") " pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:05:25.237381 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.237299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fca53ba5-e5a9-4af1-8910-72a0f613d62e-proxy-tls\") pod \"splitter-graph-a8c3e-bc877ccbf-qwvt7\" (UID: \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\") " pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:05:25.237430 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:05:25.237417 2577 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-a8c3e-serving-cert: secret "splitter-graph-a8c3e-serving-cert" not found Apr 23 14:05:25.237513 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:05:25.237482 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca53ba5-e5a9-4af1-8910-72a0f613d62e-proxy-tls podName:fca53ba5-e5a9-4af1-8910-72a0f613d62e nodeName:}" failed. No retries permitted until 2026-04-23 14:05:25.737466281 +0000 UTC m=+1998.715374850 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fca53ba5-e5a9-4af1-8910-72a0f613d62e-proxy-tls") pod "splitter-graph-a8c3e-bc877ccbf-qwvt7" (UID: "fca53ba5-e5a9-4af1-8910-72a0f613d62e") : secret "splitter-graph-a8c3e-serving-cert" not found Apr 23 14:05:25.237855 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.237839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca53ba5-e5a9-4af1-8910-72a0f613d62e-openshift-service-ca-bundle\") pod \"splitter-graph-a8c3e-bc877ccbf-qwvt7\" (UID: \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\") " pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:05:25.742037 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.742003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fca53ba5-e5a9-4af1-8910-72a0f613d62e-proxy-tls\") pod \"splitter-graph-a8c3e-bc877ccbf-qwvt7\" (UID: \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\") " pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:05:25.744250 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.744221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fca53ba5-e5a9-4af1-8910-72a0f613d62e-proxy-tls\") pod \"splitter-graph-a8c3e-bc877ccbf-qwvt7\" (UID: \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\") " pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:05:25.987024 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:25.986994 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:05:26.111283 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:26.111262 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7"] Apr 23 14:05:26.113657 ip-10-0-132-207 kubenswrapper[2577]: W0423 14:05:26.113625 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca53ba5_e5a9_4af1_8910_72a0f613d62e.slice/crio-823c254a6cb109358777ae259b8d4673d37c851fa1564f219425430f65917dab WatchSource:0}: Error finding container 823c254a6cb109358777ae259b8d4673d37c851fa1564f219425430f65917dab: Status 404 returned error can't find the container with id 823c254a6cb109358777ae259b8d4673d37c851fa1564f219425430f65917dab Apr 23 14:05:26.238357 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:26.238320 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" event={"ID":"fca53ba5-e5a9-4af1-8910-72a0f613d62e","Type":"ContainerStarted","Data":"4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6"} Apr 23 14:05:26.238357 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:26.238358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" event={"ID":"fca53ba5-e5a9-4af1-8910-72a0f613d62e","Type":"ContainerStarted","Data":"823c254a6cb109358777ae259b8d4673d37c851fa1564f219425430f65917dab"} Apr 23 14:05:26.238567 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:26.238440 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:05:26.256260 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:26.256167 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" podStartSLOduration=1.256146523 podStartE2EDuration="1.256146523s" podCreationTimestamp="2026-04-23 14:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:05:26.255311409 +0000 UTC m=+1999.233220001" watchObservedRunningTime="2026-04-23 14:05:26.256146523 +0000 UTC m=+1999.234055117" Apr 23 14:05:32.247937 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:05:32.247908 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:13:39.718977 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:39.718940 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7"] Apr 23 14:13:39.721295 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:39.719154 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e" containerID="cri-o://4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6" gracePeriod=30 Apr 23 14:13:39.870726 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:39.870691 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8"] Apr 23 14:13:39.871037 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:39.871010 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kserve-container" containerID="cri-o://ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1" gracePeriod=30 Apr 23 14:13:39.871157 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:39.871110 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kube-rbac-proxy" containerID="cri-o://fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22" gracePeriod=30 Apr 23 14:13:40.598114 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:40.598069 2577 generic.go:358] "Generic (PLEG): container finished" podID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerID="fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22" exitCode=2 Apr 23 14:13:40.598293 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:40.598132 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" event={"ID":"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf","Type":"ContainerDied","Data":"fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22"} Apr 23 14:13:42.245505 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.245447 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:13:42.818040 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.818012 2577 util.go:48] "No ready sandbox for pod can be found. 
Apr 23 14:13:42.886838 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.886801 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-proxy-tls\") pod \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") "
Apr 23 14:13:42.887018 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.886869 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnxmd\" (UniqueName: \"kubernetes.io/projected/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-kube-api-access-hnxmd\") pod \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") "
Apr 23 14:13:42.887018 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.886913 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\") pod \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\" (UID: \"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf\") "
Apr 23 14:13:42.887289 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.887259 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-error-404-isvc-a8c3e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-a8c3e-kube-rbac-proxy-sar-config") pod "2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" (UID: "2ba177c1-5524-4560-8e62-9ef0ce9f1cbf"). InnerVolumeSpecName "error-404-isvc-a8c3e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:13:42.888925 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.888895 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" (UID: "2ba177c1-5524-4560-8e62-9ef0ce9f1cbf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:13:42.889031 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.888954 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-kube-api-access-hnxmd" (OuterVolumeSpecName: "kube-api-access-hnxmd") pod "2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" (UID: "2ba177c1-5524-4560-8e62-9ef0ce9f1cbf"). InnerVolumeSpecName "kube-api-access-hnxmd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 14:13:42.988096 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.988065 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\""
Apr 23 14:13:42.988096 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.988091 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnxmd\" (UniqueName: \"kubernetes.io/projected/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-kube-api-access-hnxmd\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\""
Apr 23 14:13:42.988096 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:42.988102 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf-error-404-isvc-a8c3e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\""
Apr 23 14:13:43.607366 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.607324 2577 generic.go:358] "Generic (PLEG): container finished" podID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerID="ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1" exitCode=0
Apr 23 14:13:43.607749 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.607396 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8"
Apr 23 14:13:43.607749 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.607409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" event={"ID":"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf","Type":"ContainerDied","Data":"ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1"}
Apr 23 14:13:43.607749 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.607452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8" event={"ID":"2ba177c1-5524-4560-8e62-9ef0ce9f1cbf","Type":"ContainerDied","Data":"e02a2f22e25c8417317425b92b5f109e61b1222e008b53868c2d24a3e720ec8b"}
Apr 23 14:13:43.607749 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.607471 2577 scope.go:117] "RemoveContainer" containerID="fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22"
Apr 23 14:13:43.615254 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.615237 2577 scope.go:117] "RemoveContainer" containerID="ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1"
Apr 23 14:13:43.622102 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.622075 2577 scope.go:117] "RemoveContainer" containerID="fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22"
Apr 23 14:13:43.622343 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:13:43.622320 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22\": container with ID starting with fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22 not found: ID does not exist" containerID="fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22"
Apr 23 14:13:43.622406 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.622352 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22"} err="failed to get container status \"fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22\": rpc error: code = NotFound desc = could not find container \"fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22\": container with ID starting with fd13e53254576a755909b12f183440ddd383f8a002c796da7bed2cd791d44c22 not found: ID does not exist"
Apr 23 14:13:43.622406 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.622369 2577 scope.go:117] "RemoveContainer" containerID="ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1"
Apr 23 14:13:43.622622 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:13:43.622600 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1\": container with ID starting with ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1 not found: ID does not exist" containerID="ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1"
Apr 23 14:13:43.622691 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.622631 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1"} err="failed to get container status \"ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1\": rpc error: code = NotFound desc = could not find container \"ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1\": container with ID starting with ceca1c9acf58237f5acf77b38798b4236eb0983dece102d98d907975156730e1 not found: ID does not exist"
Apr 23 14:13:43.626757 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.626733 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8"]
Apr 23 14:13:43.629804 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:43.629783 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a8c3e-predictor-965fc7978-k4bn8"]
Apr 23 14:13:45.583735 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:45.583702 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" path="/var/lib/kubelet/pods/2ba177c1-5524-4560-8e62-9ef0ce9f1cbf/volumes"
Apr 23 14:13:47.245484 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:47.245447 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 14:13:52.245741 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:52.245698 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 14:13:52.246156 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:52.245802 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7"
Apr 23 14:13:57.246186 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:13:57.246093 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e" probeResult="failure" output="HTTP probe failed with statuscode: 503"
pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:14:02.246055 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:02.246014 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:14:07.245805 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:07.245765 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:14:09.745976 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:14:09.745946 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca53ba5_e5a9_4af1_8910_72a0f613d62e.slice/crio-4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6.scope\": RecentStats: unable to find data in memory cache]" Apr 23 14:14:09.863413 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:09.863387 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:14:09.991478 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:09.991390 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fca53ba5-e5a9-4af1-8910-72a0f613d62e-proxy-tls\") pod \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\" (UID: \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\") " Apr 23 14:14:09.991663 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:09.991520 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca53ba5-e5a9-4af1-8910-72a0f613d62e-openshift-service-ca-bundle\") pod \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\" (UID: \"fca53ba5-e5a9-4af1-8910-72a0f613d62e\") " Apr 23 14:14:09.991863 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:09.991827 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fca53ba5-e5a9-4af1-8910-72a0f613d62e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "fca53ba5-e5a9-4af1-8910-72a0f613d62e" (UID: "fca53ba5-e5a9-4af1-8910-72a0f613d62e"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:14:09.993471 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:09.993450 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fca53ba5-e5a9-4af1-8910-72a0f613d62e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fca53ba5-e5a9-4af1-8910-72a0f613d62e" (UID: "fca53ba5-e5a9-4af1-8910-72a0f613d62e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:14:10.092321 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.092279 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fca53ba5-e5a9-4af1-8910-72a0f613d62e-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:14:10.092321 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.092314 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fca53ba5-e5a9-4af1-8910-72a0f613d62e-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:14:10.683527 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.683474 2577 generic.go:358] "Generic (PLEG): container finished" podID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerID="4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6" exitCode=0 Apr 23 14:14:10.683735 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.683550 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" Apr 23 14:14:10.683735 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.683565 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" event={"ID":"fca53ba5-e5a9-4af1-8910-72a0f613d62e","Type":"ContainerDied","Data":"4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6"} Apr 23 14:14:10.683735 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.683607 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7" event={"ID":"fca53ba5-e5a9-4af1-8910-72a0f613d62e","Type":"ContainerDied","Data":"823c254a6cb109358777ae259b8d4673d37c851fa1564f219425430f65917dab"} Apr 23 14:14:10.683735 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.683626 2577 scope.go:117] "RemoveContainer" containerID="4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6" Apr 23 14:14:10.691350 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.691334 2577 scope.go:117] "RemoveContainer" containerID="4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6" Apr 23 14:14:10.691638 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:14:10.691618 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6\": container with ID starting with 4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6 not found: ID does not exist" containerID="4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6" Apr 23 14:14:10.691706 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.691648 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6"} err="failed to get container status \"4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6\": rpc error: code = NotFound desc = could not find container \"4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6\": container with ID starting with 4d3c4574e6ea72045204e4e7ffe1175d9a28ee2ec6c1c0164b3c0f24763c0ab6 not found: ID does not exist" Apr 23 14:14:10.706548 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.706521 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7"] Apr 23 14:14:10.712414 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:10.712386 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a8c3e-bc877ccbf-qwvt7"] Apr 23 14:14:11.584104 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:14:11.584065 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" path="/var/lib/kubelet/pods/fca53ba5-e5a9-4af1-8910-72a0f613d62e/volumes" Apr 23 14:21:09.823509 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:09.823467 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb"] Apr 23 14:21:09.825935 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:09.823702 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerName="switch-graph-feffa" containerID="cri-o://7bfc4d292052f016dca898c5a3c7aa3100314329d4095c48672a714af5106912" gracePeriod=30 Apr 23 14:21:10.001385 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:10.001352 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk"] Apr 23 14:21:10.001721 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:10.001681 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kserve-container" containerID="cri-o://a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1" gracePeriod=30 Apr 23 14:21:10.001826 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:10.001758 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kube-rbac-proxy" containerID="cri-o://4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269" gracePeriod=30 Apr 23 14:21:10.822393 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:10.822358 2577 generic.go:358] "Generic (PLEG): container finished" podID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerID="4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269" exitCode=2 Apr 23 14:21:10.822590 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:10.822430 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" event={"ID":"7830be0b-2892-4e47-b17d-bb0aa34efc73","Type":"ContainerDied","Data":"4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269"} Apr 23 14:21:12.640011 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.639986 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:21:12.706366 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.706339 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-feffa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7830be0b-2892-4e47-b17d-bb0aa34efc73-error-404-isvc-feffa-kube-rbac-proxy-sar-config\") pod \"7830be0b-2892-4e47-b17d-bb0aa34efc73\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " Apr 23 14:21:12.706511 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.706388 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgbhd\" (UniqueName: \"kubernetes.io/projected/7830be0b-2892-4e47-b17d-bb0aa34efc73-kube-api-access-wgbhd\") pod \"7830be0b-2892-4e47-b17d-bb0aa34efc73\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " Apr 23 14:21:12.706511 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.706410 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7830be0b-2892-4e47-b17d-bb0aa34efc73-proxy-tls\") pod \"7830be0b-2892-4e47-b17d-bb0aa34efc73\" (UID: \"7830be0b-2892-4e47-b17d-bb0aa34efc73\") " Apr 23 14:21:12.706760 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.706735 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7830be0b-2892-4e47-b17d-bb0aa34efc73-error-404-isvc-feffa-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-feffa-kube-rbac-proxy-sar-config") pod "7830be0b-2892-4e47-b17d-bb0aa34efc73" (UID: "7830be0b-2892-4e47-b17d-bb0aa34efc73"). InnerVolumeSpecName "error-404-isvc-feffa-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:21:12.708400 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.708368 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7830be0b-2892-4e47-b17d-bb0aa34efc73-kube-api-access-wgbhd" (OuterVolumeSpecName: "kube-api-access-wgbhd") pod "7830be0b-2892-4e47-b17d-bb0aa34efc73" (UID: "7830be0b-2892-4e47-b17d-bb0aa34efc73"). InnerVolumeSpecName "kube-api-access-wgbhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:21:12.708400 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.708386 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7830be0b-2892-4e47-b17d-bb0aa34efc73-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7830be0b-2892-4e47-b17d-bb0aa34efc73" (UID: "7830be0b-2892-4e47-b17d-bb0aa34efc73"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:21:12.806863 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.806802 2577 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-feffa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7830be0b-2892-4e47-b17d-bb0aa34efc73-error-404-isvc-feffa-kube-rbac-proxy-sar-config\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:21:12.806863 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.806834 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wgbhd\" (UniqueName: \"kubernetes.io/projected/7830be0b-2892-4e47-b17d-bb0aa34efc73-kube-api-access-wgbhd\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:21:12.806863 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.806844 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7830be0b-2892-4e47-b17d-bb0aa34efc73-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\"" Apr 23 14:21:12.828121 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.828099 2577 generic.go:358] "Generic (PLEG): container finished" podID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerID="a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1" exitCode=0 Apr 23 14:21:12.828207 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.828166 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" Apr 23 14:21:12.828207 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.828181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" event={"ID":"7830be0b-2892-4e47-b17d-bb0aa34efc73","Type":"ContainerDied","Data":"a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1"} Apr 23 14:21:12.828280 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.828217 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk" event={"ID":"7830be0b-2892-4e47-b17d-bb0aa34efc73","Type":"ContainerDied","Data":"7eb35964d19a0a1b84ff2aa6cb94330185fb6d0900ab28192fa3106188baaf24"} Apr 23 14:21:12.828280 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.828238 2577 scope.go:117] "RemoveContainer" containerID="4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269" Apr 23 14:21:12.835398 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.835381 2577 scope.go:117] "RemoveContainer" containerID="a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1" Apr 23 14:21:12.842093 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.842072 2577 scope.go:117] "RemoveContainer" containerID="4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269" Apr 23 14:21:12.842325 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:21:12.842307 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269\": container with ID starting with 4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269 not found: ID does not exist" containerID="4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269" Apr 23 14:21:12.842373 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.842334 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269"} err="failed to get container status \"4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269\": rpc error: code = NotFound desc = could not find container \"4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269\": container with ID starting with 4a04b94b443468f97a1511768fc905ddcfc037a62f865982751f7fc478ec0269 not found: ID does not exist" Apr 23 14:21:12.842373 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.842352 2577 scope.go:117] "RemoveContainer" containerID="a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1" Apr 23 14:21:12.842615 ip-10-0-132-207 kubenswrapper[2577]: E0423 14:21:12.842596 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1\": container with ID starting with a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1 not found: ID does not exist" containerID="a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1" Apr 23 14:21:12.842667 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.842622 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1"} err="failed to get container status \"a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1\": rpc error: code = NotFound desc = could not find container \"a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1\": container with ID starting with a6b36c2cf542b9a92b573ec04e595e780bb4ee26e9145be0f7d8470156bf77c1 not found: ID does not exist" Apr 23 14:21:12.848754 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.848733 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk"] Apr 23 14:21:12.853153 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:12.853132 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-feffa-predictor-75cd96c4d6-tzwhk"] Apr 23 14:21:13.146525 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:13.146434 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerName="switch-graph-feffa" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:21:13.584091 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:13.584056 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" path="/var/lib/kubelet/pods/7830be0b-2892-4e47-b17d-bb0aa34efc73/volumes" Apr 23 14:21:18.147139 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:18.147096 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerName="switch-graph-feffa" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:21:23.146681 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:23.146645 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerName="switch-graph-feffa" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:21:23.147044 ip-10-0-132-207 
Apr 23 14:21:25.078085 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:25.078060 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:25.897929 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:25.897874 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:26.725840 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:26.725816 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:27.519502 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:27.519473 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:28.147098 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:28.147058 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerName="switch-graph-feffa" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 14:21:28.354710 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:28.354666 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:29.154847 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:29.154804 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:29.946185 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:29.946152 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:30.751781 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:30.751721 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:31.591357 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:31.591327 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:32.432702 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:32.432677 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:33.146762 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:33.146718 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerName="switch-graph-feffa" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 14:21:33.254244 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:33.254214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:34.047045 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:34.047018 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-feffa-7cd874cdb5-n9vxb_361259e9-32f4-43ac-bfe5-7d4ae4337229/switch-graph-feffa/0.log"
Apr 23 14:21:38.147378 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:38.147338 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerName="switch-graph-feffa" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 14:21:39.399973 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:39.399945 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x4nd8_75053ed6-040a-450c-b423-ce9ec4714d2f/global-pull-secret-syncer/0.log"
Apr 23 14:21:39.534375 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:39.534335 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-shltj_2241e05b-7796-4e2e-b1cf-f47baaeef969/konnectivity-agent/0.log"
Apr 23 14:21:39.560232 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:39.560207 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-207.ec2.internal_bbbaef9da88b934e809d0d29bccb4dd7/haproxy/0.log"
Apr 23 14:21:39.906585 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:39.906441 2577 generic.go:358] "Generic (PLEG): container finished" podID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerID="7bfc4d292052f016dca898c5a3c7aa3100314329d4095c48672a714af5106912" exitCode=0
Apr 23 14:21:39.906733 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:39.906512 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" event={"ID":"361259e9-32f4-43ac-bfe5-7d4ae4337229","Type":"ContainerDied","Data":"7bfc4d292052f016dca898c5a3c7aa3100314329d4095c48672a714af5106912"}
Apr 23 14:21:39.952522 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:39.952483 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb"
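The dense run of log.go:25 "Finished parsing log file" entries above is the kubelet re-reading a container's CRI log file end to end, most plausibly because a client (the e2e harness, or the must-gather that appears below) is repeatedly fetching logs; each fetch parses the file to completion. Those files follow the line-oriented CRI logging convention: an RFC3339Nano timestamp, the stream name, a P/F tag (partial or full line), then the message. A hedged parser sketch for that format:

```go
// Parse one line of the CRI container log format referenced by the
// "Finished parsing log file" entries: "<ts> <stdout|stderr> <P|F> <msg>".
package crilog

import (
	"fmt"
	"strings"
	"time"
)

type Entry struct {
	Time    time.Time
	Stream  string // "stdout" or "stderr"
	Partial bool   // tag "P": the logical line continues in the next entry
	Message string
}

func ParseLine(line string) (Entry, error) {
	parts := strings.SplitN(line, " ", 4)
	if len(parts) != 4 {
		return Entry{}, fmt.Errorf("malformed CRI log line: %q", line)
	}
	ts, err := time.Parse(time.RFC3339Nano, parts[0])
	if err != nil {
		return Entry{}, err
	}
	return Entry{Time: ts, Stream: parts[1], Partial: parts[2] == "P", Message: parts[3]}, nil
}
```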
Apr 23 14:21:40.097646 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.097570 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/361259e9-32f4-43ac-bfe5-7d4ae4337229-openshift-service-ca-bundle\") pod \"361259e9-32f4-43ac-bfe5-7d4ae4337229\" (UID: \"361259e9-32f4-43ac-bfe5-7d4ae4337229\") "
Apr 23 14:21:40.097782 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.097655 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/361259e9-32f4-43ac-bfe5-7d4ae4337229-proxy-tls\") pod \"361259e9-32f4-43ac-bfe5-7d4ae4337229\" (UID: \"361259e9-32f4-43ac-bfe5-7d4ae4337229\") "
Apr 23 14:21:40.097943 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.097925 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/361259e9-32f4-43ac-bfe5-7d4ae4337229-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "361259e9-32f4-43ac-bfe5-7d4ae4337229" (UID: "361259e9-32f4-43ac-bfe5-7d4ae4337229"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 14:21:40.099543 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.099521 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361259e9-32f4-43ac-bfe5-7d4ae4337229-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "361259e9-32f4-43ac-bfe5-7d4ae4337229" (UID: "361259e9-32f4-43ac-bfe5-7d4ae4337229"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:21:40.198415 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.198390 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/361259e9-32f4-43ac-bfe5-7d4ae4337229-openshift-service-ca-bundle\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\""
Apr 23 14:21:40.198415 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.198413 2577 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/361259e9-32f4-43ac-bfe5-7d4ae4337229-proxy-tls\") on node \"ip-10-0-132-207.ec2.internal\" DevicePath \"\""
Apr 23 14:21:40.910722 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.910687 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb" event={"ID":"361259e9-32f4-43ac-bfe5-7d4ae4337229","Type":"ContainerDied","Data":"0fc0ab5366e3d7e5cad402df8ee78040af76afc0ebc751f05e80f95072111bc2"}
Apr 23 14:21:40.911140 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.910731 2577 scope.go:117] "RemoveContainer" containerID="7bfc4d292052f016dca898c5a3c7aa3100314329d4095c48672a714af5106912"
Apr 23 14:21:40.911140 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.910737 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb"
Apr 23 14:21:40.932421 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.932395 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb"]
Apr 23 14:21:40.936822 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:40.936801 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-feffa-7cd874cdb5-n9vxb"]
Apr 23 14:21:41.583947 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:41.583913 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" path="/var/lib/kubelet/pods/361259e9-32f4-43ac-bfe5-7d4ae4337229/volumes"
Apr 23 14:21:43.450911 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:43.450884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hw27g_9fe377af-7b17-4ea6-9181-973d470d1441/node-exporter/0.log"
Apr 23 14:21:43.478610 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:43.478586 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hw27g_9fe377af-7b17-4ea6-9181-973d470d1441/kube-rbac-proxy/0.log"
Apr 23 14:21:43.502133 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:43.502114 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hw27g_9fe377af-7b17-4ea6-9181-973d470d1441/init-textfile/0.log"
Apr 23 14:21:46.542682 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.542639 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"]
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.542992 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543008 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543027 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kube-rbac-proxy"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543036 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kube-rbac-proxy"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543051 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerName="switch-graph-feffa"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543059 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerName="switch-graph-feffa"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543071 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kserve-container"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543080 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kserve-container"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543095 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kube-rbac-proxy"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543104 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kube-rbac-proxy"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543115 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kserve-container"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543122 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kserve-container"
Apr 23 14:21:46.543177 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543183 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kserve-container"
Apr 23 14:21:46.543855 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543195 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="fca53ba5-e5a9-4af1-8910-72a0f613d62e" containerName="splitter-graph-a8c3e"
Apr 23 14:21:46.543855 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543205 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kube-rbac-proxy"
Apr 23 14:21:46.543855 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543214 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="361259e9-32f4-43ac-bfe5-7d4ae4337229" containerName="switch-graph-feffa"
Apr 23 14:21:46.543855 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543224 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ba177c1-5524-4560-8e62-9ef0ce9f1cbf" containerName="kserve-container"
Apr 23 14:21:46.543855 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.543235 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7830be0b-2892-4e47-b17d-bb0aa34efc73" containerName="kube-rbac-proxy"
Apr 23 14:21:46.546224 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.546200 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
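The RemoveStaleState burst fires when the must-gather pod is admitted: the CPU and memory managers sweep checkpointed per-container assignments whose pods (the four kserve pods deleted earlier) no longer exist before admitting the new one. A toy sketch of that sweep, under the assumption that the state can be modeled as a map keyed by (podUID, container); the real managers persist checkpoint files rather than a bare map:

```go
// Drop per-container state for pods that are no longer active, in the
// spirit of cpu_manager/memory_manager RemoveStaleState above.
// Toy in-memory version; not kubelet source.
package staleness

type key struct{ podUID, container string }

func removeStaleState(state map[key]struct{}, activePods map[string]bool) {
	for k := range state {
		if !activePods[k.podUID] {
			delete(state, k) // e.g. splitter-graph-a8c3e, deleted minutes ago
		}
	}
}
```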
Apr 23 14:21:46.548955 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.548936 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cvccm\"/\"openshift-service-ca.crt\""
Apr 23 14:21:46.550235 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.550218 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-cvccm\"/\"default-dockercfg-2cmhh\""
Apr 23 14:21:46.550340 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.550249 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cvccm\"/\"kube-root-ca.crt\""
Apr 23 14:21:46.554807 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.554786 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"]
Apr 23 14:21:46.643817 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.643774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-lib-modules\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.643817 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.643820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-proc\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.644042 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.643854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-podres\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.644042 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.643885 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv2dw\" (UniqueName: \"kubernetes.io/projected/7ec5d89b-2b42-4d56-9246-16e4b45bb235-kube-api-access-mv2dw\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.644042 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.643943 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-sys\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.745103 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.745052 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-podres\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.745299 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.745117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv2dw\" (UniqueName: \"kubernetes.io/projected/7ec5d89b-2b42-4d56-9246-16e4b45bb235-kube-api-access-mv2dw\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.745299 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.745149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-sys\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.745299 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.745187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-lib-modules\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.745299 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.745234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-podres\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.745299 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.745265 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-sys\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.745299 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.745277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-lib-modules\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.745627 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.745317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-proc\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.745627 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.745395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7ec5d89b-2b42-4d56-9246-16e4b45bb235-proc\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.754463 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.754440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv2dw\" (UniqueName: \"kubernetes.io/projected/7ec5d89b-2b42-4d56-9246-16e4b45bb235-kube-api-access-mv2dw\") pod \"perf-node-gather-daemonset-l8bn8\" (UID: \"7ec5d89b-2b42-4d56-9246-16e4b45bb235\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.855754 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.855666 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:46.972760 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.972732 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"]
Apr 23 14:21:46.975478 ip-10-0-132-207 kubenswrapper[2577]: W0423 14:21:46.975453 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7ec5d89b_2b42_4d56_9246_16e4b45bb235.slice/crio-b2ff3705d820c3de4b0d6f84f6f8a8d17c4b3f2fd8f1416f52efcbd778c60cc6 WatchSource:0}: Error finding container b2ff3705d820c3de4b0d6f84f6f8a8d17c4b3f2fd8f1416f52efcbd778c60cc6: Status 404 returned error can't find the container with id b2ff3705d820c3de4b0d6f84f6f8a8d17c4b3f2fd8f1416f52efcbd778c60cc6
Apr 23 14:21:46.977481 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:46.977463 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 14:21:47.484697 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:47.484667 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rrmk8_d3ae8909-ecd0-47e1-a99c-4ea293db3077/dns/0.log"
Apr 23 14:21:47.505899 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:47.505874 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rrmk8_d3ae8909-ecd0-47e1-a99c-4ea293db3077/kube-rbac-proxy/0.log"
Apr 23 14:21:47.574188 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:47.574164 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wxlj2_5ab5642a-1989-41c1-956f-98f92fcc6f23/dns-node-resolver/0.log"
Apr 23 14:21:47.931913 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:47.931884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8" event={"ID":"7ec5d89b-2b42-4d56-9246-16e4b45bb235","Type":"ContainerStarted","Data":"b92e5e9f8b4af88c3588ea52db333b1fd6ad0f48868a3a3cd036b98d53593e8a"}
Apr 23 14:21:47.931913 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:47.931916 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8" event={"ID":"7ec5d89b-2b42-4d56-9246-16e4b45bb235","Type":"ContainerStarted","Data":"b2ff3705d820c3de4b0d6f84f6f8a8d17c4b3f2fd8f1416f52efcbd778c60cc6"}
Apr 23 14:21:47.932121 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:47.932004 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:47.951310 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:47.951267 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8" podStartSLOduration=1.951255908 podStartE2EDuration="1.951255908s" podCreationTimestamp="2026-04-23 14:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:21:47.949460237 +0000 UTC m=+2980.927368830" watchObservedRunningTime="2026-04-23 14:21:47.951255908 +0000 UTC m=+2980.929164500"
Apr 23 14:21:48.063248 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:48.063219 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mbln5_aecfee63-4703-49e8-81cc-aa07bc06ce4e/node-ca/0.log"
Apr 23 14:21:49.248308 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:49.248276 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b9l6f_5d0b8118-d3c3-4333-a6a3-c53abf8e3daa/serve-healthcheck-canary/0.log"
Apr 23 14:21:49.750546 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:49.750515 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dhqrw_bec1f7dd-ec47-40f1-8ca6-554c81f3b55c/kube-rbac-proxy/0.log"
Apr 23 14:21:49.771173 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:49.771143 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dhqrw_bec1f7dd-ec47-40f1-8ca6-554c81f3b55c/exporter/0.log"
Apr 23 14:21:49.793524 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:49.793501 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dhqrw_bec1f7dd-ec47-40f1-8ca6-554c81f3b55c/extractor/0.log"
Apr 23 14:21:51.863081 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:51.863051 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-2shc8_909b49ca-874d-4fa9-a88d-1205920e24fb/manager/0.log"
Apr 23 14:21:51.885016 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:51.884990 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-tdtsd_5cdd1980-12e8-4103-8738-ba37e5119ae4/server/0.log"
Apr 23 14:21:52.199677 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:52.199651 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-sr6pk_6fc263fb-458d-40a2-a680-52de2c3b4007/seaweedfs/0.log"
Apr 23 14:21:53.942812 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:53.942782 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-l8bn8"
Apr 23 14:21:57.708990 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:57.708963 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9tzn_4a590caf-dc65-421e-a4c8-40d3258ddd7b/kube-multus-additional-cni-plugins/0.log"
Apr 23 14:21:57.729517 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:57.729476 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9tzn_4a590caf-dc65-421e-a4c8-40d3258ddd7b/egress-router-binary-copy/0.log"
Apr 23 14:21:57.751029 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:57.751006 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9tzn_4a590caf-dc65-421e-a4c8-40d3258ddd7b/cni-plugins/0.log"
Apr 23 14:21:57.772350 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:57.772325 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9tzn_4a590caf-dc65-421e-a4c8-40d3258ddd7b/bond-cni-plugin/0.log"
Apr 23 14:21:57.793608 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:57.793553 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9tzn_4a590caf-dc65-421e-a4c8-40d3258ddd7b/routeoverride-cni/0.log"
Apr 23 14:21:57.814283 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:57.814261 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9tzn_4a590caf-dc65-421e-a4c8-40d3258ddd7b/whereabouts-cni-bincopy/0.log"
Apr 23 14:21:57.834086 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:57.834063 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9tzn_4a590caf-dc65-421e-a4c8-40d3258ddd7b/whereabouts-cni/0.log"
Apr 23 14:21:58.027255 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:58.027220 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xvg2d_b61ffc5b-bae6-4b74-a181-3e3df6606045/kube-multus/0.log"
Apr 23 14:21:58.049047 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:58.048961 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6pz6w_821df7f9-3f87-4f86-a7e9-82cad302fff0/network-metrics-daemon/0.log"
Apr 23 14:21:58.067183 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:58.067155 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6pz6w_821df7f9-3f87-4f86-a7e9-82cad302fff0/kube-rbac-proxy/0.log"
Apr 23 14:21:59.511183 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:59.511152 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mwbjc_58abee5a-98ee-4a90-ab84-a17d06d08d00/ovn-controller/0.log"
Apr 23 14:21:59.540655 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:59.540626 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mwbjc_58abee5a-98ee-4a90-ab84-a17d06d08d00/ovn-acl-logging/0.log"
Apr 23 14:21:59.559792 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:59.559760 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mwbjc_58abee5a-98ee-4a90-ab84-a17d06d08d00/kube-rbac-proxy-node/0.log"
Apr 23 14:21:59.583269 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:59.583241 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mwbjc_58abee5a-98ee-4a90-ab84-a17d06d08d00/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 14:21:59.602330 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:59.602307 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mwbjc_58abee5a-98ee-4a90-ab84-a17d06d08d00/northd/0.log"
Apr 23 14:21:59.622159 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:59.622134 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mwbjc_58abee5a-98ee-4a90-ab84-a17d06d08d00/nbdb/0.log"
Apr 23 14:21:59.648118 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:59.648094 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mwbjc_58abee5a-98ee-4a90-ab84-a17d06d08d00/sbdb/0.log"
Apr 23 14:21:59.746584 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:21:59.746552 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mwbjc_58abee5a-98ee-4a90-ab84-a17d06d08d00/ovnkube-controller/0.log"
Apr 23 14:22:00.834001 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:22:00.833975 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fl2td_373af144-ae77-4496-8057-d855373807e4/network-check-target-container/0.log"
Apr 23 14:22:01.785314 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:22:01.785286 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-v5s77_8de6e215-aa9d-4003-aac4-d2a7bbdb59fb/iptables-alerter/0.log"
Apr 23 14:22:02.405065 ip-10-0-132-207 kubenswrapper[2577]: I0423 14:22:02.405032 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-g4zgx_7c29fef9-0671-485c-988d-0b06e4091d1a/tuned/0.log"