Mar 18 16:44:05.865913 ip-10-0-143-175 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:44:06.328767 ip-10-0-143-175 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:06.328767 ip-10-0-143-175 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:44:06.328767 ip-10-0-143-175 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:06.328767 ip-10-0-143-175 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:44:06.328767 ip-10-0-143-175 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:06.331694 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.331576 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:44:06.335505 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335484 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:06.335505 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335504 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335510 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335515 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335519 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335533 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335537 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335539 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335542 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335545 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335548 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335551 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335553 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335556 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335565 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335568 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335571 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335574 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335577 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335580 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:06.335593 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335583 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335585 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335588 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335591 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335593 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335597 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335601 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335605 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335607 2578 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335610 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335613 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335615 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335618 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335621 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335624 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335627 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335629 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335632 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335634 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335662 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:06.336034 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335666 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335669 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335672 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335674 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335677 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335680 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335683 2578 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335691 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335694 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335697 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335699 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335702 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335704 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335708 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335712 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335715 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335718 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335721 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335724 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335726 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:06.336490 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335729 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335731 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335734 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335737 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335740 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335743 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335746 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335748 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335751 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335754 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335756 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335759 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335761 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335764 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335767 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335769 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335772 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335774 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335777 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:06.336970 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335779 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335790 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335793 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335795 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335798 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335800 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.335803 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336222 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336227 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336230 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336233 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336236 2578 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336239 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336242 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336244 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336247 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336250 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336252 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336255 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336258 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:06.337417 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336260 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336263 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336266 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336268 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336271 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336274 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336276 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336279 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336281 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336284 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336287 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336289 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336292 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336295 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336298 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336300 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336304 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336306 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336309 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336312 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:06.337899 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336315 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336317 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336320 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336323 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336326 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336328 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336331 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336333 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336336 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336340 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336343 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336346 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336349 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336352 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336354 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336358 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336362 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336365 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336368 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:06.338384 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336371 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336373 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336376 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336379 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336382 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336384 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336387 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336390 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336393 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336396 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336398 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336401 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336404 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336407 2578 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336410 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336413 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336415 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336418 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336421 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336424 2578 
feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:06.338846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336426 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336428 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336431 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336434 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336436 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336439 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336442 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336444 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336447 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336449 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336452 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336454 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336457 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.336459 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337779 2578 flags.go:64] FLAG: --address="0.0.0.0" Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337791 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337800 2578 flags.go:64] FLAG: --anonymous-auth="true" Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337804 2578 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337809 2578 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337813 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337818 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 18 16:44:06.339317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337823 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337826 2578 flags.go:64] FLAG: 
--authorization-webhook-cache-unauthorized-ttl="30s" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337829 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337833 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337836 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337840 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337843 2578 flags.go:64] FLAG: --cgroup-root="" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337846 2578 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337849 2578 flags.go:64] FLAG: --client-ca-file="" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337852 2578 flags.go:64] FLAG: --cloud-config="" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337855 2578 flags.go:64] FLAG: --cloud-provider="external" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337858 2578 flags.go:64] FLAG: --cluster-dns="[]" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337862 2578 flags.go:64] FLAG: --cluster-domain="" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337865 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337868 2578 flags.go:64] FLAG: --config-dir="" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337871 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337875 2578 flags.go:64] FLAG: --container-log-max-files="5" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337879 2578 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337883 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337886 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337889 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337892 2578 flags.go:64] FLAG: --contention-profiling="false" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337895 2578 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337898 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337902 2578 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 18 16:44:06.339866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337905 2578 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337910 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 18 16:44:06.340467 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:44:06.337914 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337917 2578 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337920 2578 flags.go:64] FLAG: --enable-load-reader="false" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337923 2578 flags.go:64] FLAG: --enable-server="true" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337926 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337931 2578 flags.go:64] FLAG: --event-burst="100" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337935 2578 flags.go:64] FLAG: --event-qps="50" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337938 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337941 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337944 2578 flags.go:64] FLAG: --eviction-hard="" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337948 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337951 2578 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337954 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337957 2578 flags.go:64] FLAG: --eviction-soft="" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337960 2578 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337963 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337966 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337969 2578 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337972 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337975 2578 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337979 2578 flags.go:64] FLAG: --feature-gates="" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337983 2578 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337986 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 16:44:06.340467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337989 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337992 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337995 2578 flags.go:64] FLAG: --healthz-port="10248" Mar 18 16:44:06.341103 
ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.337998 2578 flags.go:64] FLAG: --help="false" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338001 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338005 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338008 2578 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338011 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338015 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338018 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338021 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338025 2578 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338027 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338030 2578 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338033 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338037 2578 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338040 2578 flags.go:64] FLAG: --kube-reserved="" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338043 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338046 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338049 2578 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338052 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338055 2578 flags.go:64] FLAG: --lock-file="" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338057 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338060 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:44:06.341103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338064 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338069 2578 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338072 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338075 2578 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:44:06.341701 
ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338078 2578 flags.go:64] FLAG: --logging-format="text" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338081 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338085 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338088 2578 flags.go:64] FLAG: --manifest-url="" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338090 2578 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338095 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338098 2578 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338102 2578 flags.go:64] FLAG: --max-pods="110" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338105 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338109 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338112 2578 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338115 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338118 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338121 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338124 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338132 2578 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338136 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338139 2578 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338143 2578 flags.go:64] FLAG: --pod-cidr="" Mar 18 16:44:06.341701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338146 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338152 2578 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338156 2578 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338159 2578 flags.go:64] FLAG: --pods-per-core="0" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338162 2578 flags.go:64] FLAG: --port="10250" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338165 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:44:06.342243 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:44:06.338168 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0aad8566b6c5f4b0d" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338172 2578 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338175 2578 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338177 2578 flags.go:64] FLAG: --register-node="true" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338180 2578 flags.go:64] FLAG: --register-schedulable="true" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338183 2578 flags.go:64] FLAG: --register-with-taints="" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338187 2578 flags.go:64] FLAG: --registry-burst="10" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338190 2578 flags.go:64] FLAG: --registry-qps="5" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338193 2578 flags.go:64] FLAG: --reserved-cpus="" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338196 2578 flags.go:64] FLAG: --reserved-memory="" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338200 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338203 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338206 2578 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338209 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338212 2578 flags.go:64] FLAG: --runonce="false" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338215 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338218 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338221 2578 flags.go:64] FLAG: --seccomp-default="false" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338224 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338231 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 16:44:06.342243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338234 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338237 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338241 2578 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338244 2578 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338247 2578 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338250 2578 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: 
I0318 16:44:06.338253 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338256 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338259 2578 flags.go:64] FLAG: --system-cgroups="" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338262 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338268 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338271 2578 flags.go:64] FLAG: --tls-cert-file="" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338274 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338279 2578 flags.go:64] FLAG: --tls-min-version="" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338281 2578 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338284 2578 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338287 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338290 2578 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338293 2578 flags.go:64] FLAG: --v="2" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338298 2578 flags.go:64] FLAG: --version="false" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338302 2578 flags.go:64] FLAG: --vmodule="" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338306 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.338309 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338408 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:06.342874 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338412 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338415 2578 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338418 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338422 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338425 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338428 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338431 2578 feature_gate.go:328] unrecognized feature gate: 
AutomatedEtcdBackup Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338436 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338439 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338442 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338444 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338447 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338450 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338453 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338455 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338458 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338460 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338468 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338471 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338474 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:06.343468 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338476 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338479 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338482 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338484 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338487 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338490 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338492 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338495 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338500 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:06.344002 
ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338503 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338505 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338508 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338511 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338513 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338517 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338519 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338535 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338539 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338542 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:06.344002 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338547 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338549 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338552 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338555 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338558 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338561 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338564 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338566 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338569 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338572 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338575 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338578 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 
16:44:06.338581 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338584 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338586 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338589 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338591 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338594 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338597 2578 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338599 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:06.344475 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338602 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338606 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338609 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338612 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338614 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338617 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338620 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338622 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338626 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338628 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338631 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338635 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
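
The feature_gate.go:328 warnings in this stretch of the log come from the kubelet being handed the cluster-wide OpenShift feature-gate list, most of which names gates the kubelet binary was not compiled with; each unknown name is logged as a warning and skipped rather than treated as fatal, as the successful startup further down shows. Below is a minimal Go sketch of that warn-and-ignore pattern, for illustration only: it is not the k8s.io/component-base/featuregate implementation, and the names knownGates/setGates are invented here, with gate names and defaults copied from this log.

package main

import "log"

// knownGates stands in for the gate registry compiled into the kubelet.
// Names and defaults are examples copied from the "feature gates:" summary
// line in the log above.
var knownGates = map[string]bool{
	"ImageVolume": true,
	"NodeSwap":    false,
	"KMSv1":       true,
}

// setGates applies requested gate values, warning on names the binary does
// not recognize instead of erroring out -- mirroring the behavior seen in
// the feature_gate.go:328 lines.
func setGates(requested map[string]bool) {
	for name, value := range requested {
		if _, ok := knownGates[name]; !ok {
			log.Printf("unrecognized feature gate: %s", name)
			continue // ignored; startup proceeds
		}
		knownGates[name] = value
	}
}

func main() {
	setGates(map[string]bool{
		"NodeSwap":            false,
		"AutomatedEtcdBackup": true, // cluster-level gate, unknown to this binary
	})
	log.Printf("feature gates: %v", knownGates)
}

Tolerating unknown gate names is plausibly what allows a single cluster-wide gate list to be fed to components built from different repositories, at the cost of the noisy warning floods seen here.
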
Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338640 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338642 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338645 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338648 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338650 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338653 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338656 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338658 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:06.345017 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338661 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:06.345765 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338663 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:06.345765 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338666 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:06.345765 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338669 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:06.345765 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338672 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:06.345765 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:06.338674 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:06.345765 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.339365 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:06.347487 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.347454 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 18 16:44:06.347487 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.347478 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 16:44:06.352120 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.349115 2578 server.go:962] "Client rotation is on, will bootstrap in background" Mar 18 16:44:06.352120 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.351829 2578 bootstrap.go:101] "Use 
the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 16:44:06.352871 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.352858 2578 server.go:1019] "Starting client certificate rotation" Mar 18 16:44:06.352977 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.352955 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:06.353031 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.353009 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:06.378286 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.378257 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:06.381784 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.381758 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:06.398090 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.397678 2578 log.go:25] "Validated CRI v1 runtime API" Mar 18 16:44:06.404230 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.404203 2578 log.go:25] "Validated CRI v1 image API" Mar 18 16:44:06.405448 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.405430 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 16:44:06.409597 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.409518 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 99aab877-c9f3-43d7-b682-8eba7cd9d349:/dev/nvme0n1p3 cd893aa3-f987-4b72-9bb0-dca469250b5b:/dev/nvme0n1p4] Mar 18 16:44:06.409597 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.409558 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Mar 18 16:44:06.409821 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.409803 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 18 16:44:06.415343 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.415222 2578 manager.go:217] Machine: {Timestamp:2026-03-18 16:44:06.413373378 +0000 UTC m=+0.425482198 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102431 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c4993120c1b1150d50d79b2493e9a SystemUUID:ec2c4993-120c-1b11-50d5-0d79b2493e9a BootID:7f53d6e8-173c-4d0a-bba8-ec165d32daa5 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 
Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9a:19:54:ce:2f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9a:19:54:ce:2f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:e7:42:dd:8e:91 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 16:44:06.415343 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.415333 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
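
The cAdvisor Machine entry just above is easier to sanity-check with the units spelled out: MemoryCapacity 33164496896 bytes is roughly 30.9 GiB, and NumCores:8 against NumPhysicalCores:4 implies two hardware threads per core, which matches the Threads:[0 4], [1 5], [2 6], [3 7] sibling pairs in the Topology dump. The quick Go arithmetic below is purely illustrative, with all input values copied from the log:

package main

import "fmt"

func main() {
	const memoryCapacity = 33164496896 // bytes, MemoryCapacity from the Machine entry
	const numCores = 8                 // logical CPUs (NumCores)
	const numPhysicalCores = 4         // physical cores (NumPhysicalCores)

	fmt.Printf("memory: %.2f GiB\n", float64(memoryCapacity)/float64(1<<30)) // ~30.89 GiB
	fmt.Printf("threads per core: %d\n", numCores/numPhysicalCores)          // 2, matching Threads:[0 4] etc.
}
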
Mar 18 16:44:06.415471 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.415429 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 18 16:44:06.416497 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.416467 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 18 16:44:06.416666 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.416499 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-175.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 18 16:44:06.416718 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.416675 2578 topology_manager.go:138] "Creating topology manager with none policy" Mar 18 16:44:06.416718 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.416685 2578 container_manager_linux.go:306] "Creating device plugin manager" Mar 18 16:44:06.416718 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.416698 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:44:06.417446 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.417433 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:44:06.418743 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.418732 2578 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:44:06.418867 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.418858 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Mar 18 16:44:06.421510 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.421497 2578 kubelet.go:491] "Attempting to sync node with API server" Mar 18 16:44:06.421582 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.421519 2578 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Mar 18 16:44:06.421582 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.421544 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 16:44:06.421582 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.421555 2578 kubelet.go:397] "Adding apiserver pod source" Mar 18 16:44:06.421582 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.421566 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 18 16:44:06.423120 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.423103 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:44:06.423168 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.423126 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:44:06.426252 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.426230 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1" Mar 18 16:44:06.428138 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.428118 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 18 16:44:06.429821 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429806 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 16:44:06.429895 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429827 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 16:44:06.429895 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429837 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 16:44:06.429895 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429845 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 16:44:06.429895 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429853 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 16:44:06.429895 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429864 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 16:44:06.429895 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429873 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 16:44:06.429895 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429881 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 16:44:06.429895 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429892 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 16:44:06.430142 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429900 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 16:44:06.430142 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429914 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 16:44:06.430142 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.429927 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 16:44:06.430828 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.430817 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 16:44:06.430876 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.430832 2578 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Mar 18 16:44:06.434417 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.434401 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 18 16:44:06.434501 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.434447 2578 server.go:1295] "Started kubelet" Mar 18 16:44:06.434585 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.434552 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 16:44:06.435202 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.434591 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 16:44:06.435299 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.435263 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 18 16:44:06.435257 ip-10-0-143-175 systemd[1]: Started Kubernetes Kubelet. Mar 18 16:44:06.436473 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.436457 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 16:44:06.438102 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.438079 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-175.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 16:44:06.438102 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.438098 2578 server.go:317] "Adding debug handlers to kubelet server" Mar 18 16:44:06.438261 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.438143 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-175.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 18 16:44:06.438329 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.438260 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 18 16:44:06.442736 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.442715 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Mar 18 16:44:06.443324 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.443305 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 16:44:06.444044 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.444028 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 18 16:44:06.444189 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.444176 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Mar 18 16:44:06.444376 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.444351 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 18 16:44:06.444545 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.444333 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:06.444603 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.444543 2578 reconstruct.go:97] "Volume reconstruction finished" Mar 18 16:44:06.444603 ip-10-0-143-175 kubenswrapper[2578]: I0318 
16:44:06.444556 2578 reconciler.go:26] "Reconciler: start to sync state"
Mar 18 16:44:06.446030 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.445114 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-175.ec2.internal.189dfd3848c07fd6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-175.ec2.internal,UID:ip-10-0-143-175.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-175.ec2.internal,},FirstTimestamp:2026-03-18 16:44:06.43441455 +0000 UTC m=+0.446523369,LastTimestamp:2026-03-18 16:44:06.43441455 +0000 UTC m=+0.446523369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-175.ec2.internal,}"
Mar 18 16:44:06.446147 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.446131 2578 factory.go:55] Registering systemd factory
Mar 18 16:44:06.446202 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.446170 2578 factory.go:223] Registration of the systemd container factory successfully
Mar 18 16:44:06.446421 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.446400 2578 factory.go:153] Registering CRI-O factory
Mar 18 16:44:06.446421 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.446420 2578 factory.go:223] Registration of the crio container factory successfully
Mar 18 16:44:06.446571 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.446493 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 16:44:06.446571 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.446515 2578 factory.go:103] Registering Raw factory
Mar 18 16:44:06.446571 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.446547 2578 manager.go:1196] Started watching for new ooms in manager
Mar 18 16:44:06.447231 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.447215 2578 manager.go:319] Starting recovery of all containers
Mar 18 16:44:06.447311 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.447229 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 18 16:44:06.447778 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.447756 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 18 16:44:06.447852 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.447790 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-175.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Mar 18 16:44:06.451062 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.451040 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hdcrw"
Mar 18 16:44:06.453050 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.453002 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 18 16:44:06.458090 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.458071 2578 manager.go:324] Recovery completed
Mar 18 16:44:06.458917 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.458898 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hdcrw"
Mar 18 16:44:06.459896 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.459875 2578 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Mar 18 16:44:06.463005 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.462991 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:06.465278 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.465262 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:06.465343 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.465291 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:06.465343 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.465301 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:06.465862 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.465847 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Mar 18 16:44:06.465862 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.465861 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Mar 18 16:44:06.465978 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.465879 2578 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:44:06.467228 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.467110 2578 policy_none.go:49] "None policy: Start"
Mar 18 16:44:06.467278 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.467232 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 18 16:44:06.467278 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.467242 2578 state_mem.go:35] "Initializing new in-memory state store"
Mar 18 16:44:06.467865 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.467790 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-175.ec2.internal.189dfd384a976dd6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-175.ec2.internal,UID:ip-10-0-143-175.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-175.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-175.ec2.internal,},FirstTimestamp:2026-03-18 16:44:06.465277398 +0000 UTC m=+0.477386214,LastTimestamp:2026-03-18 16:44:06.465277398 +0000 UTC m=+0.477386214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-175.ec2.internal,}"
Mar 18 16:44:06.502286 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.502265 2578 manager.go:341] "Starting Device Plugin manager"
Mar 18 16:44:06.521781 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.502308 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 18 16:44:06.521781 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.502321 2578 server.go:85] "Starting device plugin registration server"
Mar 18 16:44:06.521781 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.502595 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 18 16:44:06.521781 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.502610 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 18 16:44:06.521781 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.502694 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 18 16:44:06.521781 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.502784 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 18 16:44:06.521781 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.502794 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 18 16:44:06.521781 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.503336 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Mar 18 16:44:06.521781 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.503369 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-175.ec2.internal\" not found"
Mar 18 16:44:06.587573 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.587536 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 18 16:44:06.587573 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.587576 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 18 16:44:06.587763 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.587600 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 18 16:44:06.587763 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.587609 2578 kubelet.go:2451] "Starting kubelet main sync loop" Mar 18 16:44:06.587763 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.587652 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 18 16:44:06.590927 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.590902 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:06.603192 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.603158 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:06.604154 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.604135 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:06.604265 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.604173 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:06.604265 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.604189 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:06.604265 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.604215 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.610711 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.610695 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.610773 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.610718 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-175.ec2.internal\": node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:06.631396 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.631373 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:06.687864 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.687761 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal"] Mar 18 16:44:06.688000 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.687877 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:06.688853 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.688835 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:06.688938 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.688869 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:06.688938 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.688881 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:06.691288 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.691272 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:06.691438 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:44:06.691417 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.691487 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.691451 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:06.692073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.692052 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:06.692176 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.692081 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:06.692176 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.692091 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:06.692176 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.692119 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:06.692176 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.692138 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:06.692176 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.692147 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:06.694976 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.694961 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.695077 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.694988 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:06.695844 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.695828 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:06.695923 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.695859 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:06.695923 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.695873 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:06.712278 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.712246 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-175.ec2.internal\" not found" node="ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.716338 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.716320 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-175.ec2.internal\" not found" node="ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.732406 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.732372 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:06.745730 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.745697 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9afad3d4608ba8d74c66b5a17d2a5def-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal\" (UID: \"9afad3d4608ba8d74c66b5a17d2a5def\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.832763 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.832728 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:06.846626 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.846587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a2c314ed716f4e99aa7d05b49c88fc46-config\") pod \"kube-apiserver-proxy-ip-10-0-143-175.ec2.internal\" (UID: \"a2c314ed716f4e99aa7d05b49c88fc46\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.846746 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.846662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9afad3d4608ba8d74c66b5a17d2a5def-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal\" (UID: \"9afad3d4608ba8d74c66b5a17d2a5def\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.846746 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.846686 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9afad3d4608ba8d74c66b5a17d2a5def-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal\" (UID: \"9afad3d4608ba8d74c66b5a17d2a5def\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.846746 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.846736 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9afad3d4608ba8d74c66b5a17d2a5def-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal\" (UID: \"9afad3d4608ba8d74c66b5a17d2a5def\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.933636 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:06.933601 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:06.947132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.947003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9afad3d4608ba8d74c66b5a17d2a5def-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal\" (UID: \"9afad3d4608ba8d74c66b5a17d2a5def\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.947132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.947032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a2c314ed716f4e99aa7d05b49c88fc46-config\") pod \"kube-apiserver-proxy-ip-10-0-143-175.ec2.internal\" (UID: \"a2c314ed716f4e99aa7d05b49c88fc46\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.947132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.947075 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a2c314ed716f4e99aa7d05b49c88fc46-config\") pod \"kube-apiserver-proxy-ip-10-0-143-175.ec2.internal\" (UID: \"a2c314ed716f4e99aa7d05b49c88fc46\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal" Mar 18 16:44:06.947132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:06.947080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9afad3d4608ba8d74c66b5a17d2a5def-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal\" (UID: \"9afad3d4608ba8d74c66b5a17d2a5def\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" Mar 18 16:44:07.015241 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.015194 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" Mar 18 16:44:07.018053 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.018008 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal" Mar 18 16:44:07.034372 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:07.034338 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:07.134925 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:07.134877 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:07.235569 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:07.235471 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:07.336112 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:07.336078 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:07.353551 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.353517 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 16:44:07.353684 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.353669 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:44:07.433057 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.433027 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:07.436573 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:07.436554 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:07.443332 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.443313 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Mar 18 16:44:07.451660 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.451638 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 18 16:44:07.461159 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.461133 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:39:06 +0000 UTC" deadline="2027-11-11 07:15:33.342332107 +0000 UTC" Mar 18 16:44:07.461236 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.461168 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14462h31m25.88116745s" Mar 18 16:44:07.476496 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.476474 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zf4kf" Mar 18 16:44:07.484974 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.484949 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zf4kf" Mar 18 16:44:07.537160 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:07.537084 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 
16:44:07.637837 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:07.637793 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:07.670459 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:07.670422 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c314ed716f4e99aa7d05b49c88fc46.slice/crio-90902809f2340a43151da09f250f214e5a0f317a3793697abb9eec684280dbe0 WatchSource:0}: Error finding container 90902809f2340a43151da09f250f214e5a0f317a3793697abb9eec684280dbe0: Status 404 returned error can't find the container with id 90902809f2340a43151da09f250f214e5a0f317a3793697abb9eec684280dbe0 Mar 18 16:44:07.670880 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:07.670858 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9afad3d4608ba8d74c66b5a17d2a5def.slice/crio-df61040e538c4d311ce5c5eb935f0ac663f255906c9da3ce37e31d58541f1c9f WatchSource:0}: Error finding container df61040e538c4d311ce5c5eb935f0ac663f255906c9da3ce37e31d58541f1c9f: Status 404 returned error can't find the container with id df61040e538c4d311ce5c5eb935f0ac663f255906c9da3ce37e31d58541f1c9f Mar 18 16:44:07.678114 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.678093 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:44:07.738353 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:07.738315 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:07.838844 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:07.838813 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:07.939642 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:07.939598 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-175.ec2.internal\" not found" Mar 18 16:44:07.963630 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:07.963600 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:08.026014 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.025981 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:08.044310 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.044278 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" Mar 18 16:44:08.056507 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.056483 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:44:08.057664 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.057644 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal" Mar 18 16:44:08.065795 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.065776 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:44:08.422779 ip-10-0-143-175 kubenswrapper[2578]: 
I0318 16:44:08.422666 2578 apiserver.go:52] "Watching apiserver" Mar 18 16:44:08.429516 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.429034 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 18 16:44:08.430791 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.430758 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6p9n7","openshift-network-operator/iptables-alerter-htm47","openshift-ovn-kubernetes/ovnkube-node-qrlf7","kube-system/konnectivity-agent-9fxzt","openshift-cluster-node-tuning-operator/tuned-bdr5f","openshift-image-registry/node-ca-bgzmm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal","openshift-multus/multus-additional-cni-plugins-m75lt","openshift-multus/network-metrics-daemon-bcgtq","openshift-network-diagnostics/network-check-target-ml2kv","kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx","openshift-dns/node-resolver-s8lc2"] Mar 18 16:44:08.435906 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.435884 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:08.436030 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:08.435973 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:08.438415 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.438189 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.440335 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.440309 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kg6fd\"" Mar 18 16:44:08.440588 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.440572 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 18 16:44:08.440796 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.440781 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 18 16:44:08.440965 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.440940 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-9fxzt" Mar 18 16:44:08.440965 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.440953 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:08.443123 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.442758 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Mar 18 16:44:08.443123 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.442854 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Mar 18 16:44:08.443123 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.442924 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-974rr\"" Mar 18 16:44:08.443335 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.443215 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.443654 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.443636 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.445360 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.445338 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 18 16:44:08.446371 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.446355 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 18 16:44:08.446913 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.446416 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 18 16:44:08.447165 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.447144 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-47rlt\"" Mar 18 16:44:08.447254 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.446505 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 18 16:44:08.447338 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.446465 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rq8lg\"" Mar 18 16:44:08.447398 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.446557 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 18 16:44:08.447398 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.446610 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Mar 18 16:44:08.447499 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.446665 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:08.447499 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.446596 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 18 
16:44:08.449436 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.449069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.450788 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.450772 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 18 16:44:08.450904 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.450886 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 18 16:44:08.451022 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.450871 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 18 16:44:08.451079 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.450819 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mh6jl\"" Mar 18 16:44:08.453228 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.453091 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:08.454023 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.453832 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.455802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/67114ccf-1178-47ed-acd5-eaf483936c9f-iptables-alerter-script\") pod \"iptables-alerter-htm47\" (UID: \"67114ccf-1178-47ed-acd5-eaf483936c9f\") " pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.455802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-run-systemd\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.455802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455592 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-ovnkube-script-lib\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.455802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f79627fb-f2a3-4632-89c2-5d99716c1cc9-serviceca\") pod \"node-ca-bgzmm\" (UID: \"f79627fb-f2a3-4632-89c2-5d99716c1cc9\") " pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.455802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4vmd\" (UniqueName: \"kubernetes.io/projected/f79627fb-f2a3-4632-89c2-5d99716c1cc9-kube-api-access-z4vmd\") pod \"node-ca-bgzmm\" (UID: 
\"f79627fb-f2a3-4632-89c2-5d99716c1cc9\") " pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.455802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455705 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67114ccf-1178-47ed-acd5-eaf483936c9f-host-slash\") pod \"iptables-alerter-htm47\" (UID: \"67114ccf-1178-47ed-acd5-eaf483936c9f\") " pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.455802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455724 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2x5k\" (UniqueName: \"kubernetes.io/projected/67114ccf-1178-47ed-acd5-eaf483936c9f-kube-api-access-z2x5k\") pod \"iptables-alerter-htm47\" (UID: \"67114ccf-1178-47ed-acd5-eaf483936c9f\") " pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.455802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455761 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-env-overrides\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.455802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455786 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0faceadc-eddf-479d-b008-553314c47823-konnectivity-ca\") pod \"konnectivity-agent-9fxzt\" (UID: \"0faceadc-eddf-479d-b008-553314c47823\") " pod="kube-system/konnectivity-agent-9fxzt" Mar 18 16:44:08.455802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455801 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-sysconfig\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-sysctl-conf\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455829 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455848 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-sys\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-run-netns\") pod 
\"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455889 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-etc-openvswitch\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455913 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-cni-bin\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-sysctl-d\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-systemd\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.455995 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8v7r\" (UniqueName: \"kubernetes.io/projected/dbc66209-3812-4679-96f3-4550a7abfa0c-kube-api-access-x8v7r\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456014 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456055 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vd4t\" (UniqueName: \"kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t\") pod \"network-check-target-ml2kv\" (UID: \"6fad1cd6-2abc-416f-8534-fda50c4cccd9\") " pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-kubelet\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456144 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456174 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0faceadc-eddf-479d-b008-553314c47823-agent-certs\") pod \"konnectivity-agent-9fxzt\" (UID: \"0faceadc-eddf-479d-b008-553314c47823\") " pod="kube-system/konnectivity-agent-9fxzt" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456197 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-kubernetes\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbc66209-3812-4679-96f3-4550a7abfa0c-tmp\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456233 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.456305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-systemd-units\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-run-openvswitch\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-log-socket\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456301 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-cni-netd\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456425 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-modprobe-d\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-var-lib-kubelet\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456498 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456558 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-host\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456588 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-tuned\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456602 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rrr6x\"" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456633 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-var-lib-openvswitch\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456677 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9scm\" (UniqueName: \"kubernetes.io/projected/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-kube-api-access-n9scm\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456703 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-run\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456725 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f79627fb-f2a3-4632-89c2-5d99716c1cc9-host\") pod \"node-ca-bgzmm\" (UID: \"f79627fb-f2a3-4632-89c2-5d99716c1cc9\") " pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-run-ovn\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456801 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-ovnkube-config\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457177 ip-10-0-143-175 kubenswrapper[2578]: I0318 
16:44:08.456862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-slash\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457962 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-node-log\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457962 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456920 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-ovn-node-metrics-cert\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.457962 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.456945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-lib-modules\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.458364 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.458339 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 18 16:44:08.458364 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.458350 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nnrtd\"" Mar 18 16:44:08.458749 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.458724 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:08.458843 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:08.458823 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:08.458902 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.458888 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.460537 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.460504 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Mar 18 16:44:08.460631 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.460579 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sf4nn\"" Mar 18 16:44:08.460689 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.460667 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Mar 18 16:44:08.460766 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.460743 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Mar 18 16:44:08.461156 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.461139 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:08.462902 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.462878 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-msxvk\"" Mar 18 16:44:08.462999 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.462881 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 18 16:44:08.463066 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.463010 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 18 16:44:08.487646 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.487607 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:07 +0000 UTC" deadline="2027-12-02 10:03:55.250765126 +0000 UTC" Mar 18 16:44:08.487646 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.487642 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14969h19m46.763126993s" Mar 18 16:44:08.546069 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.546033 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 18 16:44:08.557762 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.557727 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-system-cni-dir\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.557762 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.557769 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/67114ccf-1178-47ed-acd5-eaf483936c9f-iptables-alerter-script\") pod \"iptables-alerter-htm47\" (UID: \"67114ccf-1178-47ed-acd5-eaf483936c9f\") " pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.557980 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.557818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-run-systemd\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.557980 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.557875 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-run-systemd\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.557980 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.557880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-ovnkube-script-lib\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.557980 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.557909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f79627fb-f2a3-4632-89c2-5d99716c1cc9-serviceca\") pod \"node-ca-bgzmm\" (UID: \"f79627fb-f2a3-4632-89c2-5d99716c1cc9\") " pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.558178 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558009 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4vmd\" (UniqueName: \"kubernetes.io/projected/f79627fb-f2a3-4632-89c2-5d99716c1cc9-kube-api-access-z4vmd\") pod \"node-ca-bgzmm\" (UID: \"f79627fb-f2a3-4632-89c2-5d99716c1cc9\") " pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.558178 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8q68\" (UniqueName: \"kubernetes.io/projected/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-kube-api-access-v8q68\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:08.558178 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558071 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-socket-dir-parent\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.558178 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2x5k\" (UniqueName: \"kubernetes.io/projected/67114ccf-1178-47ed-acd5-eaf483936c9f-kube-api-access-z2x5k\") pod \"iptables-alerter-htm47\" (UID: \"67114ccf-1178-47ed-acd5-eaf483936c9f\") " pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.558178 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-sysconfig\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.558178 
ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558150 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-sysctl-conf\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.558178 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-sys\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558200 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-run-k8s-cni-cncf-io\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558226 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-hostroot\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558267 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-conf-dir\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-registration-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558320 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-etc-openvswitch\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-sysctl-d\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558370 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-systemd\") pod \"tuned-bdr5f\" (UID: 
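[Annotation] The interleaved reconciler_common.go:251 / reconciler_common.go:224 / operation_generator.go:615 lines trace the kubelet volume manager's reconcile pass: each volume in the desired state of world is first verified as attached, then mounted, and a per-volume success is logged. A minimal Go sketch of that flow; all type and function names here are illustrative stand-ins, not kubelet's actual API (the real loop lives in the volumemanager/reconciler package):

```go
package main

import "fmt"

type volume struct {
	uniqueName string
	attached   bool
	mounted    bool
}

func verifyControllerAttached(v *volume) bool { return true } // stand-in check
func mountVolume(v *volume) error             { return nil }  // stand-in SetUp

// reconcile mirrors the logged sequence: VerifyControllerAttachedVolume
// (reconciler_common.go:251), then MountVolume (reconciler_common.go:224),
// then "MountVolume.SetUp succeeded" (operation_generator.go:615).
func reconcile(desired []*volume) {
	for _, v := range desired {
		if !v.attached {
			v.attached = verifyControllerAttached(v)
			continue // mount on a later pass, once attachment is confirmed
		}
		if !v.mounted && mountVolume(v) == nil {
			v.mounted = true // "MountVolume.SetUp succeeded"
		}
	}
}

func main() {
	vols := []*volume{{uniqueName: "kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-system-cni-dir"}}
	reconcile(vols) // pass 1: verify attachment
	reconcile(vols) // pass 2: mount
	fmt.Println("mounted:", vols[0].mounted)
}
```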
\"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558395 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8v7r\" (UniqueName: \"kubernetes.io/projected/dbc66209-3812-4679-96f3-4550a7abfa0c-kube-api-access-x8v7r\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558448 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-daemon-config\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558467 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f79627fb-f2a3-4632-89c2-5d99716c1cc9-serviceca\") pod \"node-ca-bgzmm\" (UID: \"f79627fb-f2a3-4632-89c2-5d99716c1cc9\") " pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.558489 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558472 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-etc-kubernetes\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-kubernetes\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558558 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/67114ccf-1178-47ed-acd5-eaf483936c9f-iptables-alerter-script\") pod \"iptables-alerter-htm47\" (UID: \"67114ccf-1178-47ed-acd5-eaf483936c9f\") " pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-ovnkube-script-lib\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558580 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/dbc66209-3812-4679-96f3-4550a7abfa0c-tmp\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-systemd-units\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558639 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-kubernetes\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558679 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-systemd-units\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-etc-openvswitch\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-run-openvswitch\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-systemd\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-log-socket\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558825 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-sysctl-conf\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558852 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-sys\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558860 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-run-openvswitch\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558872 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-host\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-sysconfig\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558827 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-log-socket\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559048 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.558825 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-host\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-tuned\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9scm\" (UniqueName: \"kubernetes.io/projected/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-kube-api-access-n9scm\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559003 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-sysctl-d\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-run\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559109 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f79627fb-f2a3-4632-89c2-5d99716c1cc9-host\") pod \"node-ca-bgzmm\" (UID: \"f79627fb-f2a3-4632-89c2-5d99716c1cc9\") " pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559201 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4822\" (UniqueName: \"kubernetes.io/projected/824b5ab3-8c23-48c6-a404-bc9472781b90-kube-api-access-v4822\") pod \"node-resolver-s8lc2\" (UID: \"824b5ab3-8c23-48c6-a404-bc9472781b90\") " pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-run-ovn\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559283 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-var-lib-kubelet\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-ovn-node-metrics-cert\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559340 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-cni-dir\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85ea82d7-7483-4563-a8fd-b12c358cdc2d-cni-binary-copy\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559425 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-run-multus-certs\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559452 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-var-lib-cni-bin\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559476 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-var-lib-cni-multus\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbtm4\" (UniqueName: \"kubernetes.io/projected/e05b6725-74f6-4c02-a1e5-690434981dd9-kube-api-access-fbtm4\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67114ccf-1178-47ed-acd5-eaf483936c9f-host-slash\") pod \"iptables-alerter-htm47\" (UID: \"67114ccf-1178-47ed-acd5-eaf483936c9f\") " pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.559884 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-run\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-env-overrides\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559728 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0faceadc-eddf-479d-b008-553314c47823-konnectivity-ca\") pod \"konnectivity-agent-9fxzt\" (UID: \"0faceadc-eddf-479d-b008-553314c47823\") " pod="kube-system/konnectivity-agent-9fxzt" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-system-cni-dir\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.560710 
ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559784 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xgvw\" (UniqueName: \"kubernetes.io/projected/85ea82d7-7483-4563-a8fd-b12c358cdc2d-kube-api-access-8xgvw\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559810 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-run-netns\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559870 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-cni-bin\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vd4t\" (UniqueName: \"kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t\") pod \"network-check-target-ml2kv\" (UID: \"6fad1cd6-2abc-416f-8534-fda50c4cccd9\") " pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559928 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-os-release\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-kubelet\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.559995 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
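[Annotation] The swap_util.go:74 line a few entries up is a capability probe that fails soft: the kubelet tests whether tmpfs supports the "noswap" mount option by actually trying it in a scratch directory, and on this first boot the empty-dir plugin directory does not exist yet, so even creating the scratch dir fails and the feature is assumed unsupported. A hedged Go sketch of that pattern; only the path and message come from the log, the rest is an assumption about the upstream implementation (and the mount itself needs root):

```go
package main

import (
	"fmt"
	"os"

	"golang.org/x/sys/unix"
)

// tmpfsNoswapSupported probes "noswap" support by mounting a tmpfs in a
// throwaway directory; any failure is treated as "not supported".
func tmpfsNoswapSupported(pluginDir string) bool {
	target, err := os.MkdirTemp(pluginDir, "tmpfs-noswap-test-")
	if err != nil {
		// The branch seen in the log: the plugin dir is missing, so
		// MkdirTemp returns the stat error and we assume no support.
		fmt.Println("error creating dir to test if tmpfs noswap is enabled. Assuming not supported:", err)
		return false
	}
	defer os.RemoveAll(target)
	if err := unix.Mount("tmpfs", target, "tmpfs", 0, "noswap"); err != nil {
		return false // kernel rejected the option
	}
	defer unix.Unmount(target, 0)
	return true
}

func main() {
	fmt.Println(tmpfsNoswapSupported("/var/lib/kubelet/plugins/kubernetes.io/empty-dir"))
}
```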
\"kubernetes.io/secret/0faceadc-eddf-479d-b008-553314c47823-agent-certs\") pod \"konnectivity-agent-9fxzt\" (UID: \"0faceadc-eddf-479d-b008-553314c47823\") " pod="kube-system/konnectivity-agent-9fxzt" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560173 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-device-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560221 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-etc-selinux\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560248 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/824b5ab3-8c23-48c6-a404-bc9472781b90-hosts-file\") pod \"node-resolver-s8lc2\" (UID: \"824b5ab3-8c23-48c6-a404-bc9472781b90\") " pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:08.560710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560277 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-cni-netd\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560302 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-modprobe-d\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-var-lib-kubelet\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560350 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/42ddca27-69b7-4448-a710-9cc62df43c14-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " 
pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlqc4\" (UniqueName: \"kubernetes.io/projected/42ddca27-69b7-4448-a710-9cc62df43c14-kube-api-access-tlqc4\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-var-lib-openvswitch\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42ddca27-69b7-4448-a710-9cc62df43c14-cni-binary-copy\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-cnibin\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-os-release\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560508 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560553 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-ovnkube-config\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-cnibin\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560608 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-run-netns\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560663 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-socket-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560690 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-sys-fs\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.561493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560745 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/824b5ab3-8c23-48c6-a404-bc9472781b90-tmp-dir\") pod \"node-resolver-s8lc2\" (UID: \"824b5ab3-8c23-48c6-a404-bc9472781b90\") " pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-slash\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-node-log\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-lib-modules\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.560978 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/42ddca27-69b7-4448-a710-9cc62df43c14-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561057 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-cni-netd\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-modprobe-d\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561176 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-var-lib-kubelet\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-var-lib-openvswitch\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561254 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-ovnkube-config\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-slash\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561819 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-node-log\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561909 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/dbc66209-3812-4679-96f3-4550a7abfa0c-lib-modules\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67114ccf-1178-47ed-acd5-eaf483936c9f-host-slash\") pod \"iptables-alerter-htm47\" (UID: \"67114ccf-1178-47ed-acd5-eaf483936c9f\") " pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.561992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f79627fb-f2a3-4632-89c2-5d99716c1cc9-host\") pod \"node-ca-bgzmm\" (UID: \"f79627fb-f2a3-4632-89c2-5d99716c1cc9\") " pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.562217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.562024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-run-ovn\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562921 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.562347 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-env-overrides\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562921 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.562470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-kubelet\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562921 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.562516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-cni-bin\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562921 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.562572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-host-run-netns\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.562921 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.562778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dbc66209-3812-4679-96f3-4550a7abfa0c-etc-tuned\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.562921 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.562835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/0faceadc-eddf-479d-b008-553314c47823-konnectivity-ca\") pod \"konnectivity-agent-9fxzt\" (UID: \"0faceadc-eddf-479d-b008-553314c47823\") " pod="kube-system/konnectivity-agent-9fxzt" Mar 18 16:44:08.563575 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.563481 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbc66209-3812-4679-96f3-4550a7abfa0c-tmp\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.569755 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.564471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-ovn-node-metrics-cert\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.569755 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.565284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0faceadc-eddf-479d-b008-553314c47823-agent-certs\") pod \"konnectivity-agent-9fxzt\" (UID: \"0faceadc-eddf-479d-b008-553314c47823\") " pod="kube-system/konnectivity-agent-9fxzt" Mar 18 16:44:08.569755 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.567925 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2x5k\" (UniqueName: \"kubernetes.io/projected/67114ccf-1178-47ed-acd5-eaf483936c9f-kube-api-access-z2x5k\") pod \"iptables-alerter-htm47\" (UID: \"67114ccf-1178-47ed-acd5-eaf483936c9f\") " pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.569755 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.568206 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4vmd\" (UniqueName: \"kubernetes.io/projected/f79627fb-f2a3-4632-89c2-5d99716c1cc9-kube-api-access-z4vmd\") pod \"node-ca-bgzmm\" (UID: \"f79627fb-f2a3-4632-89c2-5d99716c1cc9\") " pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.569755 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:08.568411 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:08.569755 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:08.568431 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:08.569755 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:08.568447 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8vd4t for pod openshift-network-diagnostics/network-check-target-ml2kv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:08.569755 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:08.568556 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t podName:6fad1cd6-2abc-416f-8534-fda50c4cccd9 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:09.068509576 +0000 UTC m=+3.080618381 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8vd4t" (UniqueName: "kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t") pod "network-check-target-ml2kv" (UID: "6fad1cd6-2abc-416f-8534-fda50c4cccd9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:08.570818 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.570781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8v7r\" (UniqueName: \"kubernetes.io/projected/dbc66209-3812-4679-96f3-4550a7abfa0c-kube-api-access-x8v7r\") pod \"tuned-bdr5f\" (UID: \"dbc66209-3812-4679-96f3-4550a7abfa0c\") " pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.571828 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.571803 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9scm\" (UniqueName: \"kubernetes.io/projected/5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa-kube-api-access-n9scm\") pod \"ovnkube-node-qrlf7\" (UID: \"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.592117 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.592056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" event={"ID":"9afad3d4608ba8d74c66b5a17d2a5def","Type":"ContainerStarted","Data":"df61040e538c4d311ce5c5eb935f0ac663f255906c9da3ce37e31d58541f1c9f"} Mar 18 16:44:08.593070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.593042 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal" event={"ID":"a2c314ed716f4e99aa7d05b49c88fc46","Type":"ContainerStarted","Data":"90902809f2340a43151da09f250f214e5a0f317a3793697abb9eec684280dbe0"} Mar 18 16:44:08.662162 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662128 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8q68\" (UniqueName: \"kubernetes.io/projected/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-kube-api-access-v8q68\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:08.662162 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662163 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-socket-dir-parent\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-run-k8s-cni-cncf-io\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-hostroot\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " 
pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-conf-dir\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662270 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-socket-dir-parent\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662305 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-conf-dir\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-run-k8s-cni-cncf-io\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-hostroot\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662336 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-registration-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.662401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662375 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:08.662401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-daemon-config\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662426 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-etc-kubernetes\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " 
pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-registration-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-etc-kubernetes\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:08.662535 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4822\" (UniqueName: \"kubernetes.io/projected/824b5ab3-8c23-48c6-a404-bc9472781b90-kube-api-access-v4822\") pod \"node-resolver-s8lc2\" (UID: \"824b5ab3-8c23-48c6-a404-bc9472781b90\") " pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-var-lib-kubelet\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:08.662615 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs podName:4bd683e7-9eae-4e40-ba74-10c5acb6fd8b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:09.162594057 +0000 UTC m=+3.174702877 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs") pod "network-metrics-daemon-bcgtq" (UID: "4bd683e7-9eae-4e40-ba74-10c5acb6fd8b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662630 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-var-lib-kubelet\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-cni-dir\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85ea82d7-7483-4563-a8fd-b12c358cdc2d-cni-binary-copy\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-run-multus-certs\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-var-lib-cni-bin\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-var-lib-cni-multus\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662800 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-var-lib-cni-multus\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662838 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-run-multus-certs\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.662890 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662875 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-var-lib-cni-bin\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbtm4\" (UniqueName: \"kubernetes.io/projected/e05b6725-74f6-4c02-a1e5-690434981dd9-kube-api-access-fbtm4\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-system-cni-dir\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xgvw\" (UniqueName: \"kubernetes.io/projected/85ea82d7-7483-4563-a8fd-b12c358cdc2d-kube-api-access-8xgvw\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.662988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-cni-dir\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663037 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-os-release\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663046 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-system-cni-dir\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-device-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-etc-selinux\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/824b5ab3-8c23-48c6-a404-bc9472781b90-hosts-file\") pod \"node-resolver-s8lc2\" (UID: \"824b5ab3-8c23-48c6-a404-bc9472781b90\") " pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/85ea82d7-7483-4563-a8fd-b12c358cdc2d-multus-daemon-config\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/42ddca27-69b7-4448-a710-9cc62df43c14-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlqc4\" (UniqueName: \"kubernetes.io/projected/42ddca27-69b7-4448-a710-9cc62df43c14-kube-api-access-tlqc4\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663216 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85ea82d7-7483-4563-a8fd-b12c358cdc2d-cni-binary-copy\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42ddca27-69b7-4448-a710-9cc62df43c14-cni-binary-copy\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.663647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663246 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-os-release\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-cnibin\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/824b5ab3-8c23-48c6-a404-bc9472781b90-hosts-file\") pod \"node-resolver-s8lc2\" (UID: \"824b5ab3-8c23-48c6-a404-bc9472781b90\") " pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663296 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-os-release\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663340 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-etc-selinux\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-cnibin\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663200 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-device-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-cnibin\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-cnibin\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663383 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663447 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-os-release\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663481 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42ddca27-69b7-4448-a710-9cc62df43c14-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-run-netns\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663515 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-socket-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-host-run-netns\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-sys-fs\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663627 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-socket-dir\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.664317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/824b5ab3-8c23-48c6-a404-bc9472781b90-tmp-dir\") pod \"node-resolver-s8lc2\" (UID: \"824b5ab3-8c23-48c6-a404-bc9472781b90\") " pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:08.664973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663661 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/42ddca27-69b7-4448-a710-9cc62df43c14-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.664973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663665 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e05b6725-74f6-4c02-a1e5-690434981dd9-sys-fs\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.664973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-system-cni-dir\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.664973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85ea82d7-7483-4563-a8fd-b12c358cdc2d-system-cni-dir\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.664973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663825 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42ddca27-69b7-4448-a710-9cc62df43c14-cni-binary-copy\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.664973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663830 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/42ddca27-69b7-4448-a710-9cc62df43c14-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.664973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.663999 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/824b5ab3-8c23-48c6-a404-bc9472781b90-tmp-dir\") pod \"node-resolver-s8lc2\" (UID: \"824b5ab3-8c23-48c6-a404-bc9472781b90\") " pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:08.664973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.664141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/42ddca27-69b7-4448-a710-9cc62df43c14-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.670952 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.670925 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4822\" (UniqueName: \"kubernetes.io/projected/824b5ab3-8c23-48c6-a404-bc9472781b90-kube-api-access-v4822\") pod \"node-resolver-s8lc2\" (UID: \"824b5ab3-8c23-48c6-a404-bc9472781b90\") " 
pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:08.671094 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.671072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8q68\" (UniqueName: \"kubernetes.io/projected/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-kube-api-access-v8q68\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:08.671151 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.671072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbtm4\" (UniqueName: \"kubernetes.io/projected/e05b6725-74f6-4c02-a1e5-690434981dd9-kube-api-access-fbtm4\") pod \"aws-ebs-csi-driver-node-xlssx\" (UID: \"e05b6725-74f6-4c02-a1e5-690434981dd9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.671480 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.671450 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlqc4\" (UniqueName: \"kubernetes.io/projected/42ddca27-69b7-4448-a710-9cc62df43c14-kube-api-access-tlqc4\") pod \"multus-additional-cni-plugins-m75lt\" (UID: \"42ddca27-69b7-4448-a710-9cc62df43c14\") " pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.672152 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.672127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xgvw\" (UniqueName: \"kubernetes.io/projected/85ea82d7-7483-4563-a8fd-b12c358cdc2d-kube-api-access-8xgvw\") pod \"multus-6p9n7\" (UID: \"85ea82d7-7483-4563-a8fd-b12c358cdc2d\") " pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.752824 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.752708 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-htm47" Mar 18 16:44:08.761803 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.761776 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9fxzt" Mar 18 16:44:08.770663 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.770636 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:08.779391 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.779358 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" Mar 18 16:44:08.785028 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.785008 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bgzmm" Mar 18 16:44:08.793617 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.793590 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m75lt" Mar 18 16:44:08.801243 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.801222 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6p9n7" Mar 18 16:44:08.814940 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.814913 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" Mar 18 16:44:08.821641 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:08.821613 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-s8lc2" Mar 18 16:44:09.167774 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.167688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:09.167774 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.167745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vd4t\" (UniqueName: \"kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t\") pod \"network-check-target-ml2kv\" (UID: \"6fad1cd6-2abc-416f-8534-fda50c4cccd9\") " pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:09.168005 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:09.167855 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:09.168005 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:09.167872 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:09.168005 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:09.167893 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:09.168005 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:09.167905 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8vd4t for pod openshift-network-diagnostics/network-check-target-ml2kv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:09.168005 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:09.167933 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs podName:4bd683e7-9eae-4e40-ba74-10c5acb6fd8b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:10.167912051 +0000 UTC m=+4.180020855 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs") pod "network-metrics-daemon-bcgtq" (UID: "4bd683e7-9eae-4e40-ba74-10c5acb6fd8b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:09.168005 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:09.167956 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t podName:6fad1cd6-2abc-416f-8534-fda50c4cccd9 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:10.167941166 +0000 UTC m=+4.180049974 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8vd4t" (UniqueName: "kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t") pod "network-check-target-ml2kv" (UID: "6fad1cd6-2abc-416f-8534-fda50c4cccd9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:09.364839 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:09.364805 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0faceadc_eddf_479d_b008_553314c47823.slice/crio-ac701d9f897bd63ef963ceae80cdd43e71ac4a260aa93808666cc66acc246d59 WatchSource:0}: Error finding container ac701d9f897bd63ef963ceae80cdd43e71ac4a260aa93808666cc66acc246d59: Status 404 returned error can't find the container with id ac701d9f897bd63ef963ceae80cdd43e71ac4a260aa93808666cc66acc246d59 Mar 18 16:44:09.366192 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:09.366153 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf79627fb_f2a3_4632_89c2_5d99716c1cc9.slice/crio-7a9454993a4b137e087b37d2a53322e3c79915667c6054cc33f57d8414c24185 WatchSource:0}: Error finding container 7a9454993a4b137e087b37d2a53322e3c79915667c6054cc33f57d8414c24185: Status 404 returned error can't find the container with id 7a9454993a4b137e087b37d2a53322e3c79915667c6054cc33f57d8414c24185 Mar 18 16:44:09.368716 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:09.368503 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67114ccf_1178_47ed_acd5_eaf483936c9f.slice/crio-6256cb4a7504195dcd1e5d219038f5a4b812ee7b33487ab8fe52e2208216da0b WatchSource:0}: Error finding container 6256cb4a7504195dcd1e5d219038f5a4b812ee7b33487ab8fe52e2208216da0b: Status 404 returned error can't find the container with id 6256cb4a7504195dcd1e5d219038f5a4b812ee7b33487ab8fe52e2208216da0b Mar 18 16:44:09.371068 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:09.371044 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824b5ab3_8c23_48c6_a404_bc9472781b90.slice/crio-25a39798b183ca57bf24f58b20c8ff3cd5451e38b27b892ee0b2e68138dd971a WatchSource:0}: Error finding container 25a39798b183ca57bf24f58b20c8ff3cd5451e38b27b892ee0b2e68138dd971a: Status 404 returned error can't find the container with id 25a39798b183ca57bf24f58b20c8ff3cd5451e38b27b892ee0b2e68138dd971a Mar 18 16:44:09.371764 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:09.371731 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode05b6725_74f6_4c02_a1e5_690434981dd9.slice/crio-e82647284ce950c368ee639a9b2a7dd0c7b546dfb702693a791a7ededdd8fc28 WatchSource:0}: Error finding container e82647284ce950c368ee639a9b2a7dd0c7b546dfb702693a791a7ededdd8fc28: Status 404 returned error can't find the container with id e82647284ce950c368ee639a9b2a7dd0c7b546dfb702693a791a7ededdd8fc28 Mar 18 16:44:09.372763 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:09.372730 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbc66209_3812_4679_96f3_4550a7abfa0c.slice/crio-719ccd9b8b74dd2e1fb6c88da7470cf4ad94aa0ed1d288cc348d13738566ea55 WatchSource:0}: Error finding 
container 719ccd9b8b74dd2e1fb6c88da7470cf4ad94aa0ed1d288cc348d13738566ea55: Status 404 returned error can't find the container with id 719ccd9b8b74dd2e1fb6c88da7470cf4ad94aa0ed1d288cc348d13738566ea55 Mar 18 16:44:09.373846 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:09.373823 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b66f0ce_7ce3_4f1b_9d2c_c7f629da04fa.slice/crio-ecc2b69f6b59214e1e6d96d845b44deeb8023f12424277a487d8a85d39bbe38c WatchSource:0}: Error finding container ecc2b69f6b59214e1e6d96d845b44deeb8023f12424277a487d8a85d39bbe38c: Status 404 returned error can't find the container with id ecc2b69f6b59214e1e6d96d845b44deeb8023f12424277a487d8a85d39bbe38c Mar 18 16:44:09.374835 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:09.374631 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85ea82d7_7483_4563_a8fd_b12c358cdc2d.slice/crio-efd9786b1df4927cd8a893f5bbdad6b2236a812bc3360971ec088ff33a64f396 WatchSource:0}: Error finding container efd9786b1df4927cd8a893f5bbdad6b2236a812bc3360971ec088ff33a64f396: Status 404 returned error can't find the container with id efd9786b1df4927cd8a893f5bbdad6b2236a812bc3360971ec088ff33a64f396 Mar 18 16:44:09.376889 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:09.376743 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ddca27_69b7_4448_a710_9cc62df43c14.slice/crio-4202dfa24caa16a0141938dd89a7971e9e9254d43cd66975dd82f9e9dca2a187 WatchSource:0}: Error finding container 4202dfa24caa16a0141938dd89a7971e9e9254d43cd66975dd82f9e9dca2a187: Status 404 returned error can't find the container with id 4202dfa24caa16a0141938dd89a7971e9e9254d43cd66975dd82f9e9dca2a187 Mar 18 16:44:09.488780 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.488599 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:07 +0000 UTC" deadline="2027-10-04 22:18:48.28252788 +0000 UTC" Mar 18 16:44:09.488780 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.488770 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13565h34m38.793760347s" Mar 18 16:44:09.588372 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.588342 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:09.588567 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:09.588452 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:09.595344 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.595312 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m75lt" event={"ID":"42ddca27-69b7-4448-a710-9cc62df43c14","Type":"ContainerStarted","Data":"4202dfa24caa16a0141938dd89a7971e9e9254d43cd66975dd82f9e9dca2a187"} Mar 18 16:44:09.596399 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.596376 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" event={"ID":"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa","Type":"ContainerStarted","Data":"ecc2b69f6b59214e1e6d96d845b44deeb8023f12424277a487d8a85d39bbe38c"} Mar 18 16:44:09.597355 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.597329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" event={"ID":"dbc66209-3812-4679-96f3-4550a7abfa0c","Type":"ContainerStarted","Data":"719ccd9b8b74dd2e1fb6c88da7470cf4ad94aa0ed1d288cc348d13738566ea55"} Mar 18 16:44:09.598290 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.598264 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s8lc2" event={"ID":"824b5ab3-8c23-48c6-a404-bc9472781b90","Type":"ContainerStarted","Data":"25a39798b183ca57bf24f58b20c8ff3cd5451e38b27b892ee0b2e68138dd971a"} Mar 18 16:44:09.599182 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.599162 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-htm47" event={"ID":"67114ccf-1178-47ed-acd5-eaf483936c9f","Type":"ContainerStarted","Data":"6256cb4a7504195dcd1e5d219038f5a4b812ee7b33487ab8fe52e2208216da0b"} Mar 18 16:44:09.600160 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.600139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bgzmm" event={"ID":"f79627fb-f2a3-4632-89c2-5d99716c1cc9","Type":"ContainerStarted","Data":"7a9454993a4b137e087b37d2a53322e3c79915667c6054cc33f57d8414c24185"} Mar 18 16:44:09.601615 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.601594 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal" event={"ID":"a2c314ed716f4e99aa7d05b49c88fc46","Type":"ContainerStarted","Data":"46f81efec4d6c58a6d9a82369e73db404b59a039ee511e510490f8461c0ccb52"} Mar 18 16:44:09.602719 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.602700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6p9n7" event={"ID":"85ea82d7-7483-4563-a8fd-b12c358cdc2d","Type":"ContainerStarted","Data":"efd9786b1df4927cd8a893f5bbdad6b2236a812bc3360971ec088ff33a64f396"} Mar 18 16:44:09.603716 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.603684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" event={"ID":"e05b6725-74f6-4c02-a1e5-690434981dd9","Type":"ContainerStarted","Data":"e82647284ce950c368ee639a9b2a7dd0c7b546dfb702693a791a7ededdd8fc28"} Mar 18 16:44:09.605857 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:09.605833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9fxzt" event={"ID":"0faceadc-eddf-479d-b008-553314c47823","Type":"ContainerStarted","Data":"ac701d9f897bd63ef963ceae80cdd43e71ac4a260aa93808666cc66acc246d59"} Mar 18 16:44:09.612942 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:44:09.612899 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-175.ec2.internal" podStartSLOduration=1.612884794 podStartE2EDuration="1.612884794s" podCreationTimestamp="2026-03-18 16:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:09.612571856 +0000 UTC m=+3.624680681" watchObservedRunningTime="2026-03-18 16:44:09.612884794 +0000 UTC m=+3.624993618" Mar 18 16:44:10.175008 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:10.174974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vd4t\" (UniqueName: \"kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t\") pod \"network-check-target-ml2kv\" (UID: \"6fad1cd6-2abc-416f-8534-fda50c4cccd9\") " pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:10.175149 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:10.175062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:10.175207 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:10.175173 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:10.175272 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:10.175230 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs podName:4bd683e7-9eae-4e40-ba74-10c5acb6fd8b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:12.175214092 +0000 UTC m=+6.187322901 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs") pod "network-metrics-daemon-bcgtq" (UID: "4bd683e7-9eae-4e40-ba74-10c5acb6fd8b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:10.175272 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:10.175231 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:10.175272 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:10.175250 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:10.175272 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:10.175262 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8vd4t for pod openshift-network-diagnostics/network-check-target-ml2kv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:10.175501 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:10.175298 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t podName:6fad1cd6-2abc-416f-8534-fda50c4cccd9 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:12.175286029 +0000 UTC m=+6.187394846 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8vd4t" (UniqueName: "kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t") pod "network-check-target-ml2kv" (UID: "6fad1cd6-2abc-416f-8534-fda50c4cccd9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:10.589412 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:10.588864 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:10.589412 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:10.589015 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:10.619843 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:10.619802 2578 generic.go:358] "Generic (PLEG): container finished" podID="9afad3d4608ba8d74c66b5a17d2a5def" containerID="045a6e76bd07dfc21f1daa6bc4c8c8ad53cf73910c4b005b80bfaa94728c0c47" exitCode=0 Mar 18 16:44:10.620344 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:10.620319 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" event={"ID":"9afad3d4608ba8d74c66b5a17d2a5def","Type":"ContainerDied","Data":"045a6e76bd07dfc21f1daa6bc4c8c8ad53cf73910c4b005b80bfaa94728c0c47"} Mar 18 16:44:11.587810 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:11.587779 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:11.587987 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:11.587924 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:11.646619 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:11.646337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" event={"ID":"9afad3d4608ba8d74c66b5a17d2a5def","Type":"ContainerStarted","Data":"46ac140bc4e8d606fdac0839504f947b98bfc48bf5d6ee15556028099a93e6fb"} Mar 18 16:44:12.194194 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:12.193388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vd4t\" (UniqueName: \"kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t\") pod \"network-check-target-ml2kv\" (UID: \"6fad1cd6-2abc-416f-8534-fda50c4cccd9\") " pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:12.194194 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:12.193467 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:12.194194 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:12.193609 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:12.194194 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:12.193672 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs podName:4bd683e7-9eae-4e40-ba74-10c5acb6fd8b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:16.193653667 +0000 UTC m=+10.205762476 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs") pod "network-metrics-daemon-bcgtq" (UID: "4bd683e7-9eae-4e40-ba74-10c5acb6fd8b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:12.194194 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:12.194073 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:12.194194 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:12.194093 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:12.194194 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:12.194105 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8vd4t for pod openshift-network-diagnostics/network-check-target-ml2kv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:12.194194 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:12.194149 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t podName:6fad1cd6-2abc-416f-8534-fda50c4cccd9 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:16.194133542 +0000 UTC m=+10.206242348 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8vd4t" (UniqueName: "kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t") pod "network-check-target-ml2kv" (UID: "6fad1cd6-2abc-416f-8534-fda50c4cccd9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:12.588178 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:12.588083 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:12.588333 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:12.588249 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:13.588773 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:13.588616 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:13.588773 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:13.588758 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:14.588925 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:14.588893 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:14.589393 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:14.589051 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:15.588688 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:15.588653 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:15.588874 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:15.588809 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:16.229372 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:16.229315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vd4t\" (UniqueName: \"kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t\") pod \"network-check-target-ml2kv\" (UID: \"6fad1cd6-2abc-416f-8534-fda50c4cccd9\") " pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:16.229938 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:16.229411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:16.229938 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:16.229512 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:16.229938 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:16.229551 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:16.229938 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:16.229567 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8vd4t for pod openshift-network-diagnostics/network-check-target-ml2kv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:16.229938 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:16.229571 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:16.229938 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:16.229633 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t podName:6fad1cd6-2abc-416f-8534-fda50c4cccd9 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:24.229613267 +0000 UTC m=+18.241722074 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8vd4t" (UniqueName: "kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t") pod "network-check-target-ml2kv" (UID: "6fad1cd6-2abc-416f-8534-fda50c4cccd9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:16.229938 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:16.229655 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs podName:4bd683e7-9eae-4e40-ba74-10c5acb6fd8b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:24.229644676 +0000 UTC m=+18.241753480 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs") pod "network-metrics-daemon-bcgtq" (UID: "4bd683e7-9eae-4e40-ba74-10c5acb6fd8b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:16.590299 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:16.589767 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:16.590299 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:16.589890 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:17.588809 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:17.588730 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:17.589344 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:17.588858 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:18.588810 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:18.588773 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:18.589272 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:18.588927 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:19.588444 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:19.588409 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:19.588634 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:19.588559 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:20.462830 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.462780 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-175.ec2.internal" podStartSLOduration=12.46276199 podStartE2EDuration="12.46276199s" podCreationTimestamp="2026-03-18 16:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:11.661179745 +0000 UTC m=+5.673288572" watchObservedRunningTime="2026-03-18 16:44:20.46276199 +0000 UTC m=+14.474870815" Mar 18 16:44:20.463232 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.463117 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-52khs"] Mar 18 16:44:20.485942 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.485905 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:20.486121 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:20.486001 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-52khs" podUID="31d1ce77-15fd-48f5-844f-07de4a0cdfc4" Mar 18 16:44:20.560098 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.560051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:20.560298 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.560114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-dbus\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:20.560298 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.560202 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-kubelet-config\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:20.588241 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.588199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:20.588404 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:20.588363 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:20.661265 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.661224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:20.661467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.661282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-dbus\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:20.661467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.661332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-kubelet-config\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:20.661467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.661379 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-dbus\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:20.661467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:20.661408 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-kubelet-config\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:20.661467 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:20.661382 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:20.661712 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:20.661575 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret podName:31d1ce77-15fd-48f5-844f-07de4a0cdfc4 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:21.161548737 +0000 UTC m=+15.173657540 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret") pod "global-pull-secret-syncer-52khs" (UID: "31d1ce77-15fd-48f5-844f-07de4a0cdfc4") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:21.164393 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:21.164350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:21.164615 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:21.164512 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:21.164615 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:21.164595 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret podName:31d1ce77-15fd-48f5-844f-07de4a0cdfc4 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:22.164580496 +0000 UTC m=+16.176689299 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret") pod "global-pull-secret-syncer-52khs" (UID: "31d1ce77-15fd-48f5-844f-07de4a0cdfc4") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:21.588231 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:21.588140 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:21.588640 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:21.588258 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:22.172750 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:22.172663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:22.172881 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:22.172779 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:22.172881 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:22.172836 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret podName:31d1ce77-15fd-48f5-844f-07de4a0cdfc4 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:24.172820446 +0000 UTC m=+18.184929250 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret") pod "global-pull-secret-syncer-52khs" (UID: "31d1ce77-15fd-48f5-844f-07de4a0cdfc4") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:22.588627 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:22.588550 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:22.588627 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:22.588593 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:22.589085 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:22.588671 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:22.589085 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:22.588790 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-52khs" podUID="31d1ce77-15fd-48f5-844f-07de4a0cdfc4" Mar 18 16:44:23.588296 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:23.588262 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:23.588544 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:23.588386 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:24.191650 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:24.191616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:24.192052 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:24.191792 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:24.192052 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:24.191841 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret podName:31d1ce77-15fd-48f5-844f-07de4a0cdfc4 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:28.191828575 +0000 UTC m=+22.203937377 (durationBeforeRetry 4s). 
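
Note the durationBeforeRetry values for the original-pull-secret mount above: 500ms, then 1s, 2s, 4s (8s and 16s follow further down). The kubelet's nestedpendingoperations layer applies exponential backoff to failed volume operations, doubling the wait after each failure up to a cap; upstream defaults are an initial 500ms and a maximum of 2m2s, but treat those exact constants as an assumption of this sketch:

    package main

    import (
    	"fmt"
    	"time"
    )

    // nextDelay doubles the previous retry delay and clamps it at max,
    // matching the 500ms -> 1s -> 2s -> 4s -> 8s -> 16s progression in the
    // log. The initial and cap values mirror upstream defaults (assumed).
    func nextDelay(prev, max time.Duration) time.Duration {
    	if d := 2 * prev; d < max {
    		return d
    	}
    	return max
    }

    func main() {
    	d := 500 * time.Millisecond
    	for i := 0; i < 9; i++ {
    		fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s
    		d = nextDelay(d, 2*time.Minute+2*time.Second)
    	}
    }

The backoff explains why the same failure is logged at widening intervals rather than every sync.
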
Mar 18 16:44:24.292679 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:24.292601 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq"
Mar 18 16:44:24.292887 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:24.292774 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:24.292887 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:24.292794 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vd4t\" (UniqueName: \"kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t\") pod \"network-check-target-ml2kv\" (UID: \"6fad1cd6-2abc-416f-8534-fda50c4cccd9\") " pod="openshift-network-diagnostics/network-check-target-ml2kv"
Mar 18 16:44:24.292887 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:24.292852 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs podName:4bd683e7-9eae-4e40-ba74-10c5acb6fd8b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.292828827 +0000 UTC m=+34.304937634 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs") pod "network-metrics-daemon-bcgtq" (UID: "4bd683e7-9eae-4e40-ba74-10c5acb6fd8b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:44:24.293066 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:24.292916 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:44:24.293066 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:24.292933 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:44:24.293066 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:24.292943 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8vd4t for pod openshift-network-diagnostics/network-check-target-ml2kv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:24.293066 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:24.293003 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t podName:6fad1cd6-2abc-416f-8534-fda50c4cccd9 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.292986772 +0000 UTC m=+34.305095581 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8vd4t" (UniqueName: "kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t") pod "network-check-target-ml2kv" (UID: "6fad1cd6-2abc-416f-8534-fda50c4cccd9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:24.588098 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:24.588007 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs"
Mar 18 16:44:24.588236 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:24.588011 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq"
Mar 18 16:44:24.588236 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:24.588147 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-52khs" podUID="31d1ce77-15fd-48f5-844f-07de4a0cdfc4"
Mar 18 16:44:24.588236 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:24.588221 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b"
Mar 18 16:44:25.588628 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:25.588591 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv"
Mar 18 16:44:25.589098 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:25.588713 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9"
Mar 18 16:44:26.588821 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:26.588792 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs"
Mar 18 16:44:26.589224 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:26.588895 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-52khs" podUID="31d1ce77-15fd-48f5-844f-07de4a0cdfc4"
Mar 18 16:44:26.589224 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:26.588925 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq"
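
The "not registered" failures above are distinct from "not found": the kubelet resolves secrets and configmaps through a per-pod, watch-based manager, and a pod's referenced objects must first be registered with that manager (and its reflector caches synced) before a Get can succeed; the "Caches populated" lines near the end of this log are the other side of the same mechanism. A registration-gated cache reduces to something like this sketch, which is illustrative and not the kubelet's implementation:

    package main

    import (
    	"fmt"
    	"sync"
    )

    // registry gates reads behind explicit registration, the way the
    // kubelet's secret/configmap manager does: unknown keys fail with
    // "not registered" rather than "not found".
    type registry struct {
    	mu      sync.RWMutex
    	objects map[string][]byte // key: "namespace/name"
    }

    func (r *registry) register(key string, data []byte) {
    	r.mu.Lock()
    	defer r.mu.Unlock()
    	r.objects[key] = data
    }

    func (r *registry) get(key string) ([]byte, error) {
    	r.mu.RLock()
    	defer r.mu.RUnlock()
    	data, ok := r.objects[key]
    	if !ok {
    		return nil, fmt.Errorf("object %q not registered", key)
    	}
    	return data, nil
    }

    func main() {
    	r := &registry{objects: map[string][]byte{}}
    	_, err := r.get("kube-system/original-pull-secret")
    	fmt.Println(err) // object "kube-system/original-pull-secret" not registered
    	r.register("kube-system/original-pull-secret", []byte("{}"))
    	_, err = r.get("kube-system/original-pull-secret")
    	fmt.Println(err) // <nil>
    }
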
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:26.589224 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:26.589010 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:27.588421 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.588090 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:27.588566 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:27.588458 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:27.675833 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.675792 2578 generic.go:358] "Generic (PLEG): container finished" podID="42ddca27-69b7-4448-a710-9cc62df43c14" containerID="4c6e1baea8e24c63fd042603b05112008e49b405310681a916694e074260e824" exitCode=0 Mar 18 16:44:27.676701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.675878 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m75lt" event={"ID":"42ddca27-69b7-4448-a710-9cc62df43c14","Type":"ContainerDied","Data":"4c6e1baea8e24c63fd042603b05112008e49b405310681a916694e074260e824"} Mar 18 16:44:27.679040 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.678995 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" event={"ID":"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa","Type":"ContainerStarted","Data":"2773a7d8196036b811aab9be4562ea618fa17d34c4ad710721c09c8785c4269b"} Mar 18 16:44:27.679040 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.679029 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" event={"ID":"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa","Type":"ContainerStarted","Data":"761207fce2bcbd82d1b38daaff9d77c79791b6859b790d4429d8fd1cc3c8da48"} Mar 18 16:44:27.679156 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.679043 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" event={"ID":"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa","Type":"ContainerStarted","Data":"68a4c448d145529f224d81d4e8bf807b3e482966162cd84494c8a96fd933c50b"} Mar 18 16:44:27.679156 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.679056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" event={"ID":"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa","Type":"ContainerStarted","Data":"8ee7c4cb3d84601bd25a616e4ffb004ad2094db62ec286c46ef7173ab7fd6e70"} Mar 18 16:44:27.679156 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.679068 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" 
event={"ID":"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa","Type":"ContainerStarted","Data":"29d1031de3e1c55925cc803db360b695d47e33a2ca9bf9083fb7b0cc23a7ccb3"} Mar 18 16:44:27.679156 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.679080 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" event={"ID":"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa","Type":"ContainerStarted","Data":"fd818da626adfd4a495158e36ee871614c0509c92f8585b82bea6a8deb5fd4d4"} Mar 18 16:44:27.680650 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.680625 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" event={"ID":"dbc66209-3812-4679-96f3-4550a7abfa0c","Type":"ContainerStarted","Data":"6363e7147f5c67f9d493742e205c23d3465b1ce3c0e810da9f9412c86ab09252"} Mar 18 16:44:27.682186 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.682150 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s8lc2" event={"ID":"824b5ab3-8c23-48c6-a404-bc9472781b90","Type":"ContainerStarted","Data":"87a5c9d0602b984e6ee7835a6d05df94c10d77c06faa00ba52645b5b9834c343"} Mar 18 16:44:27.683800 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.683774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bgzmm" event={"ID":"f79627fb-f2a3-4632-89c2-5d99716c1cc9","Type":"ContainerStarted","Data":"73c9f529532ea4035a221ad91693d8b99f66d43f57403891a80e48ac10c4abe6"} Mar 18 16:44:27.685333 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.685288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6p9n7" event={"ID":"85ea82d7-7483-4563-a8fd-b12c358cdc2d","Type":"ContainerStarted","Data":"d091e0b8536cc293ecb9de752c0d409ca7a8701e9f9688f735a3cd2824c41b08"} Mar 18 16:44:27.686733 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.686704 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" event={"ID":"e05b6725-74f6-4c02-a1e5-690434981dd9","Type":"ContainerStarted","Data":"3c6aa5c756021b39a4a4b919e15d08536f74d9eab81bf54865ea8698ad83ba8d"} Mar 18 16:44:27.688147 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.688114 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9fxzt" event={"ID":"0faceadc-eddf-479d-b008-553314c47823","Type":"ContainerStarted","Data":"b9d9ca3a0abd3962f0eb31a3525db41a765f38df59bb054f9a3cee2e360c79b8"} Mar 18 16:44:27.725460 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.725415 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bdr5f" podStartSLOduration=4.518086264 podStartE2EDuration="21.725399788s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:09.374678211 +0000 UTC m=+3.386787021" lastFinishedPulling="2026-03-18 16:44:26.581991727 +0000 UTC m=+20.594100545" observedRunningTime="2026-03-18 16:44:27.712991376 +0000 UTC m=+21.725100201" watchObservedRunningTime="2026-03-18 16:44:27.725399788 +0000 UTC m=+21.737508613" Mar 18 16:44:27.725622 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.725505 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bgzmm" podStartSLOduration=9.240581373 podStartE2EDuration="21.725500835s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:09.368098507 +0000 UTC 
m=+3.380207313" lastFinishedPulling="2026-03-18 16:44:21.853017958 +0000 UTC m=+15.865126775" observedRunningTime="2026-03-18 16:44:27.725094411 +0000 UTC m=+21.737203235" watchObservedRunningTime="2026-03-18 16:44:27.725500835 +0000 UTC m=+21.737609660" Mar 18 16:44:27.740292 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.740246 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6p9n7" podStartSLOduration=4.500174632 podStartE2EDuration="21.740233225s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:09.377360268 +0000 UTC m=+3.389469083" lastFinishedPulling="2026-03-18 16:44:26.617418859 +0000 UTC m=+20.629527676" observedRunningTime="2026-03-18 16:44:27.739804911 +0000 UTC m=+21.751913736" watchObservedRunningTime="2026-03-18 16:44:27.740233225 +0000 UTC m=+21.752342050" Mar 18 16:44:27.752191 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.752147 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s8lc2" podStartSLOduration=4.544678037 podStartE2EDuration="21.752133918s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:09.372701582 +0000 UTC m=+3.384810400" lastFinishedPulling="2026-03-18 16:44:26.580157472 +0000 UTC m=+20.592266281" observedRunningTime="2026-03-18 16:44:27.751678025 +0000 UTC m=+21.763786850" watchObservedRunningTime="2026-03-18 16:44:27.752133918 +0000 UTC m=+21.764242742" Mar 18 16:44:27.764119 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.764044 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9fxzt" podStartSLOduration=4.550776558 podStartE2EDuration="21.764026236s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:09.366830752 +0000 UTC m=+3.378939556" lastFinishedPulling="2026-03-18 16:44:26.580080417 +0000 UTC m=+20.592189234" observedRunningTime="2026-03-18 16:44:27.763817194 +0000 UTC m=+21.775926032" watchObservedRunningTime="2026-03-18 16:44:27.764026236 +0000 UTC m=+21.776135063" Mar 18 16:44:27.896908 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:27.896882 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 18 16:44:28.222957 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.222868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:28.223085 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:28.222973 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:28.223085 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:28.223022 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret podName:31d1ce77-15fd-48f5-844f-07de4a0cdfc4 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:36.223009765 +0000 UTC m=+30.235118568 (durationBeforeRetry 8s). 
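
The pod_startup_latency_tracker lines above encode a simple relationship: podStartSLOduration is the end-to-end startup time minus the time spent pulling images. For tuned-bdr5f, pull time = lastFinishedPulling - firstStartedPulling = m=+20.594100545 - m=+3.386787021 = 17.207313524s, and podStartE2EDuration 21.725399788s - 17.207313524s = 4.518086264s, exactly the reported podStartSLOduration (the m=+ monotonic offsets, not the wall-clock timestamps, are what make the subtraction exact). The earlier kube-rbac-proxy-crio entry showed SLO equal to E2E because it is a static pod with no tracked pull, hence the zero-valued 0001-01-01 pull timestamps.
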
Mar 18 16:44:28.514847 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.514602 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:44:27.896905479Z","UUID":"19378658-2887-49ca-955e-bf9c34826445","Handler":null,"Name":"","Endpoint":""}
Mar 18 16:44:28.516591 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.516569 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Mar 18 16:44:28.516740 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.516599 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Mar 18 16:44:28.588543 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.588483 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs"
Mar 18 16:44:28.588717 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.588555 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq"
Mar 18 16:44:28.588717 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:28.588646 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-52khs" podUID="31d1ce77-15fd-48f5-844f-07de4a0cdfc4"
Mar 18 16:44:28.588833 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:28.588787 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b"
Mar 18 16:44:28.691743 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.691700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-htm47" event={"ID":"67114ccf-1178-47ed-acd5-eaf483936c9f","Type":"ContainerStarted","Data":"78c2e61459c285de4b24b0dca86688156e14ae4ab20aab6040a99d46ab60ffa7"}
Mar 18 16:44:28.694320 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.694293 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" event={"ID":"e05b6725-74f6-4c02-a1e5-690434981dd9","Type":"ContainerStarted","Data":"110c126d3d6d5c871fba87246256a53133a3e67d0500e84e0934c693394bb2ff"}
Mar 18 16:44:28.707568 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.707491 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-htm47" podStartSLOduration=5.497816608 podStartE2EDuration="22.707475028s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:09.370477404 +0000 UTC m=+3.382586211" lastFinishedPulling="2026-03-18 16:44:26.580135815 +0000 UTC m=+20.592244631" observedRunningTime="2026-03-18 16:44:28.706976463 +0000 UTC m=+22.719085298" watchObservedRunningTime="2026-03-18 16:44:28.707475028 +0000 UTC m=+22.719583856"
Mar 18 16:44:28.781029 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.780995 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9fxzt"
Mar 18 16:44:28.781717 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.781666 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9fxzt"
Mar 18 16:44:28.853788 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.853756 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9fxzt"
Mar 18 16:44:28.854333 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:28.854314 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9fxzt"
Mar 18 16:44:29.588442 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:29.588408 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv"
Mar 18 16:44:29.588632 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:29.588544 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9"
pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:29.699213 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:29.699122 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" event={"ID":"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa","Type":"ContainerStarted","Data":"66bb603f2b677f22c6ce5bfff50fef2dfa455acd3a800b1912be7d039667e90b"} Mar 18 16:44:29.701124 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:29.701096 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" event={"ID":"e05b6725-74f6-4c02-a1e5-690434981dd9","Type":"ContainerStarted","Data":"38a3f705d7dc8903606ed8f79a59bab1f954a03d2eb32d720c6262496af12ee2"} Mar 18 16:44:29.720289 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:29.720245 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xlssx" podStartSLOduration=4.421401783 podStartE2EDuration="23.720231808s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:09.373688359 +0000 UTC m=+3.385797175" lastFinishedPulling="2026-03-18 16:44:28.672518378 +0000 UTC m=+22.684627200" observedRunningTime="2026-03-18 16:44:29.720200639 +0000 UTC m=+23.732309461" watchObservedRunningTime="2026-03-18 16:44:29.720231808 +0000 UTC m=+23.732340632" Mar 18 16:44:30.588758 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:30.588722 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:30.588949 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:30.588872 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-52khs" podUID="31d1ce77-15fd-48f5-844f-07de4a0cdfc4" Mar 18 16:44:30.588949 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:30.588915 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:30.589073 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:30.589053 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:31.588618 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:31.588420 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:31.589098 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:31.588725 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:32.588644 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:32.588564 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:32.589253 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:32.588670 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-52khs" podUID="31d1ce77-15fd-48f5-844f-07de4a0cdfc4" Mar 18 16:44:32.589253 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:32.588698 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:32.589253 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:32.588764 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:32.708354 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:32.708322 2578 generic.go:358] "Generic (PLEG): container finished" podID="42ddca27-69b7-4448-a710-9cc62df43c14" containerID="63301387c758434ccd128edd6409acdebabb4bfaf0559caeb7d7d578d8cc4b5b" exitCode=0 Mar 18 16:44:32.708553 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:32.708378 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m75lt" event={"ID":"42ddca27-69b7-4448-a710-9cc62df43c14","Type":"ContainerDied","Data":"63301387c758434ccd128edd6409acdebabb4bfaf0559caeb7d7d578d8cc4b5b"} Mar 18 16:44:32.711578 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:32.711558 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" event={"ID":"5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa","Type":"ContainerStarted","Data":"4515e426444b3eff84bc4338d6c84e9ecc33649a07781b9c9093f4a89666047e"} Mar 18 16:44:32.711868 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:32.711847 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:32.726096 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:32.726073 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:32.761582 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:32.761497 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" podStartSLOduration=9.131190879 podStartE2EDuration="26.761483381s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:09.37617193 +0000 UTC m=+3.388280738" lastFinishedPulling="2026-03-18 16:44:27.006464431 +0000 UTC m=+21.018573240" observedRunningTime="2026-03-18 16:44:32.76114897 +0000 UTC m=+26.773257794" watchObservedRunningTime="2026-03-18 16:44:32.761483381 +0000 UTC m=+26.773592207" Mar 18 16:44:33.588771 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:44:33.588629 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:33.589356 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:33.588839 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:33.714005 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:33.713975 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:33.714005 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:33.714010 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:33.728043 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:33.728012 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:44:34.170981 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:34.170893 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ml2kv"] Mar 18 16:44:34.171142 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:34.170991 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:34.171142 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:34.171076 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:34.173409 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:34.173380 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bcgtq"] Mar 18 16:44:34.173572 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:34.173514 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:34.173694 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:34.173671 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:34.174004 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:34.173986 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-52khs"] Mar 18 16:44:34.174084 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:34.174072 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:34.174175 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:34.174150 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-52khs" podUID="31d1ce77-15fd-48f5-844f-07de4a0cdfc4" Mar 18 16:44:34.719886 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:34.719853 2578 generic.go:358] "Generic (PLEG): container finished" podID="42ddca27-69b7-4448-a710-9cc62df43c14" containerID="ac00977923bfc6b2a068158b4c2cc7b780583644a2ec1933f4185f4b47f98217" exitCode=0 Mar 18 16:44:34.720422 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:34.719934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m75lt" event={"ID":"42ddca27-69b7-4448-a710-9cc62df43c14","Type":"ContainerDied","Data":"ac00977923bfc6b2a068158b4c2cc7b780583644a2ec1933f4185f4b47f98217"} Mar 18 16:44:35.588386 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:35.588351 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:35.588603 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:35.588369 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:35.588603 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:35.588478 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:35.588603 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:35.588573 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-52khs" podUID="31d1ce77-15fd-48f5-844f-07de4a0cdfc4" Mar 18 16:44:35.723286 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:35.723252 2578 generic.go:358] "Generic (PLEG): container finished" podID="42ddca27-69b7-4448-a710-9cc62df43c14" containerID="5cdce5173a03fa1553831c9c9ade213b1eda4dcdabc38003014129b994aec2e6" exitCode=0 Mar 18 16:44:35.723725 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:35.723293 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m75lt" event={"ID":"42ddca27-69b7-4448-a710-9cc62df43c14","Type":"ContainerDied","Data":"5cdce5173a03fa1553831c9c9ade213b1eda4dcdabc38003014129b994aec2e6"} Mar 18 16:44:36.284560 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:36.284465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:36.284707 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:36.284634 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:36.284766 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:36.284707 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret podName:31d1ce77-15fd-48f5-844f-07de4a0cdfc4 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:52.284686098 +0000 UTC m=+46.296794920 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret") pod "global-pull-secret-syncer-52khs" (UID: "31d1ce77-15fd-48f5-844f-07de4a0cdfc4") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:36.589015 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:36.588930 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:36.589171 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:36.589040 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:37.588039 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:37.588009 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:37.588428 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:37.588012 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:37.588428 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:37.588132 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ml2kv" podUID="6fad1cd6-2abc-416f-8534-fda50c4cccd9" Mar 18 16:44:37.588428 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:37.588220 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-52khs" podUID="31d1ce77-15fd-48f5-844f-07de4a0cdfc4" Mar 18 16:44:38.588302 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:38.588272 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:38.588729 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:38.588420 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b" Mar 18 16:44:39.271135 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.270904 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-175.ec2.internal" event="NodeReady" Mar 18 16:44:39.271298 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.271255 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Mar 18 16:44:39.309591 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.309554 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86d7f745b7-6hv58"] Mar 18 16:44:39.336114 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.335831 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bv7gz"] Mar 18 16:44:39.336114 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.336084 2578 util.go:30] "No sandbox for pod can be found. 
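
This is the turning point of the log. Once OVN-Kubernetes writes its CNI config, the runtime reports NetworkReady=true, the kubelet records NodeReady and pushes a fast node-status update, and the scheduler immediately starts binding workloads to the node; the SyncLoop ADD lines for image-registry, dns-default, and ingress-canary that follow are that unblocking, and the long-stuck pods above stop logging CNI errors. A small client-go sketch of checking the same Ready condition the event records (kubeconfig loading and the node name are assumptions for illustration):

    package main

    import (
    	"context"
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    // nodeReady reports whether the node's Ready condition is True, the
    // condition behind the "NodeReady" event in the log.
    func nodeReady(cs *kubernetes.Clientset, name string) (bool, error) {
    	node, err := cs.CoreV1().Nodes().Get(context.TODO(), name, metav1.GetOptions{})
    	if err != nil {
    		return false, err
    	}
    	for _, c := range node.Status.Conditions {
    		if c.Type == corev1.NodeReady {
    			return c.Status == corev1.ConditionTrue, nil
    		}
    	}
    	return false, nil
    }

    func main() {
    	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
    	if err != nil {
    		panic(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		panic(err)
    	}
    	ok, err := nodeReady(cs, "ip-10-0-143-175.ec2.internal")
    	fmt.Println(ok, err)
    }
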
Need to start a new one" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.340073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.338884 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rcgfm\"" Mar 18 16:44:39.340073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.338974 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Mar 18 16:44:39.340073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.338874 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 18 16:44:39.340073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.339133 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 18 16:44:39.345575 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.345543 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 18 16:44:39.352037 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.352012 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gwqfh"] Mar 18 16:44:39.352197 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.352183 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.354472 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.354451 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 18 16:44:39.354587 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.354481 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-775g5\"" Mar 18 16:44:39.354587 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.354481 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 18 16:44:39.377819 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.377789 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86d7f745b7-6hv58"] Mar 18 16:44:39.377819 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.377822 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gwqfh"] Mar 18 16:44:39.378015 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.377836 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bv7gz"] Mar 18 16:44:39.378015 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.377871 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:39.380282 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.380241 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 18 16:44:39.380985 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.380865 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 18 16:44:39.381158 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.381144 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tcvm2\"" Mar 18 16:44:39.381343 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.381327 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 18 16:44:39.404630 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.404597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73d3b30d-1d2b-4666-8183-1a58e717bac7-ca-trust-extracted\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.404630 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.404633 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.404806 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.404661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5pvc\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-kube-api-access-p5pvc\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.404806 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.404776 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-bound-sa-token\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.404894 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.404817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-certificates\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.404894 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.404868 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-image-registry-private-configuration\") 
pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.405055 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.404914 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-installation-pull-secrets\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.405055 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.404960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-trusted-ca\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.505734 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.505651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5pvc\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-kube-api-access-p5pvc\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.505734 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.505698 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nthfq\" (UniqueName: \"kubernetes.io/projected/f604a9b3-7355-4f54-8113-531b6f45d7f2-kube-api-access-nthfq\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.505934 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.505743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-bound-sa-token\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.505934 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.505771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqdpm\" (UniqueName: \"kubernetes.io/projected/3d42738c-ceaa-4925-8255-a2b61010e00f-kube-api-access-pqdpm\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:39.505934 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.505804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-certificates\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.505934 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.505834 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f604a9b3-7355-4f54-8113-531b6f45d7f2-config-volume\") pod 
\"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.505934 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.505862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f604a9b3-7355-4f54-8113-531b6f45d7f2-tmp-dir\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.506150 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.505964 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-image-registry-private-configuration\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.506150 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.506014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-installation-pull-secrets\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.506150 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.506047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-trusted-ca\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.506150 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.506077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.506150 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.506118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:39.506150 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.506148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73d3b30d-1d2b-4666-8183-1a58e717bac7-ca-trust-extracted\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.506394 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.506169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.506394 ip-10-0-143-175 
kubenswrapper[2578]: E0318 16:44:39.506345 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:39.506394 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:39.506365 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d7f745b7-6hv58: secret "image-registry-tls" not found Mar 18 16:44:39.506510 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:39.506415 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls podName:73d3b30d-1d2b-4666-8183-1a58e717bac7 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.006396844 +0000 UTC m=+34.018505648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls") pod "image-registry-86d7f745b7-6hv58" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7") : secret "image-registry-tls" not found Mar 18 16:44:39.506510 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.506475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73d3b30d-1d2b-4666-8183-1a58e717bac7-ca-trust-extracted\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.506649 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.506594 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-certificates\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.506922 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.506899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-trusted-ca\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.510494 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.510473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-installation-pull-secrets\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.510494 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.510488 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-image-registry-private-configuration\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.518175 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.518149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-bound-sa-token\") pod 
\"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.518296 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.518264 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5pvc\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-kube-api-access-p5pvc\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:39.587937 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.587897 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:39.588109 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.588079 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:39.590719 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.590693 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:44:39.590719 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.590709 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Mar 18 16:44:39.591284 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.591263 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gnx58\"" Mar 18 16:44:39.591414 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.591316 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:44:39.606485 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.606465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqdpm\" (UniqueName: \"kubernetes.io/projected/3d42738c-ceaa-4925-8255-a2b61010e00f-kube-api-access-pqdpm\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:39.606600 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.606495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f604a9b3-7355-4f54-8113-531b6f45d7f2-config-volume\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.606600 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.606516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f604a9b3-7355-4f54-8113-531b6f45d7f2-tmp-dir\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.606600 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.606565 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.606751 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.606602 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:39.606751 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.606658 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nthfq\" (UniqueName: \"kubernetes.io/projected/f604a9b3-7355-4f54-8113-531b6f45d7f2-kube-api-access-nthfq\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.606875 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:39.606853 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:39.606986 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.606878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f604a9b3-7355-4f54-8113-531b6f45d7f2-tmp-dir\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.606986 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:39.606926 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls podName:f604a9b3-7355-4f54-8113-531b6f45d7f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.106909124 +0000 UTC m=+34.119017936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls") pod "dns-default-bv7gz" (UID: "f604a9b3-7355-4f54-8113-531b6f45d7f2") : secret "dns-default-metrics-tls" not found Mar 18 16:44:39.606986 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:39.606982 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:39.607155 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:39.607030 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert podName:3d42738c-ceaa-4925-8255-a2b61010e00f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.107016448 +0000 UTC m=+34.119125251 (durationBeforeRetry 500ms). 
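The errors that follow all have the same shape: a projected or secret volume cannot be set up because its source Secret ("image-registry-tls", "dns-default-metrics-tls", "canary-serving-cert") does not exist yet, so the affected pods sit in ContainerCreating while the kubelet retries. A minimal client-go sketch of the same existence check, usable from outside the node while waiting for whatever publishes the secret; this assumes in-cluster config and is illustrative, not kubelet code.

    package main

    import (
        "context"
        "fmt"
        "time"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
    )

    func main() {
        cfg, err := rest.InClusterConfig()
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)
        for {
            // Same lookup the mount is blocked on in the entries above.
            _, err := cs.CoreV1().Secrets("openshift-ingress-canary").
                Get(context.TODO(), "canary-serving-cert", metav1.GetOptions{})
            if err == nil {
                fmt.Println("secret exists; the kubelet's next retry will mount it")
                return
            }
            if !apierrors.IsNotFound(err) {
                panic(err)
            }
            time.Sleep(2 * time.Second)
        }
    }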
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert") pod "ingress-canary-gwqfh" (UID: "3d42738c-ceaa-4925-8255-a2b61010e00f") : secret "canary-serving-cert" not found Mar 18 16:44:39.607155 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.607092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f604a9b3-7355-4f54-8113-531b6f45d7f2-config-volume\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.618935 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.618909 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nthfq\" (UniqueName: \"kubernetes.io/projected/f604a9b3-7355-4f54-8113-531b6f45d7f2-kube-api-access-nthfq\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:39.619047 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:39.618958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqdpm\" (UniqueName: \"kubernetes.io/projected/3d42738c-ceaa-4925-8255-a2b61010e00f-kube-api-access-pqdpm\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:40.010089 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:40.010055 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:40.010274 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:40.010232 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:40.010274 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:40.010255 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d7f745b7-6hv58: secret "image-registry-tls" not found Mar 18 16:44:40.010376 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:40.010331 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls podName:73d3b30d-1d2b-4666-8183-1a58e717bac7 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:41.010307904 +0000 UTC m=+35.022416710 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls") pod "image-registry-86d7f745b7-6hv58" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7") : secret "image-registry-tls" not found Mar 18 16:44:40.111249 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:40.111213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:40.111427 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:40.111272 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:40.111427 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:40.111386 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:40.111570 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:40.111438 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:40.111570 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:40.111467 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls podName:f604a9b3-7355-4f54-8113-531b6f45d7f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:41.111444617 +0000 UTC m=+35.123553425 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls") pod "dns-default-bv7gz" (UID: "f604a9b3-7355-4f54-8113-531b6f45d7f2") : secret "dns-default-metrics-tls" not found Mar 18 16:44:40.111570 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:40.111488 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert podName:3d42738c-ceaa-4925-8255-a2b61010e00f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:41.111474686 +0000 UTC m=+35.123583508 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert") pod "ingress-canary-gwqfh" (UID: "3d42738c-ceaa-4925-8255-a2b61010e00f") : secret "canary-serving-cert" not found Mar 18 16:44:40.312686 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:40.312599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:40.312855 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:40.312688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vd4t\" (UniqueName: \"kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t\") pod \"network-check-target-ml2kv\" (UID: \"6fad1cd6-2abc-416f-8534-fda50c4cccd9\") " pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:40.312855 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:40.312768 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:40.312855 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:40.312852 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs podName:4bd683e7-9eae-4e40-ba74-10c5acb6fd8b nodeName:}" failed. No retries permitted until 2026-03-18 16:45:12.312832453 +0000 UTC m=+66.324941265 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs") pod "network-metrics-daemon-bcgtq" (UID: "4bd683e7-9eae-4e40-ba74-10c5acb6fd8b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:40.315504 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:40.315474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vd4t\" (UniqueName: \"kubernetes.io/projected/6fad1cd6-2abc-416f-8534-fda50c4cccd9-kube-api-access-8vd4t\") pod \"network-check-target-ml2kv\" (UID: \"6fad1cd6-2abc-416f-8534-fda50c4cccd9\") " pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:40.504684 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:40.504650 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:40.588438 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:40.588351 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:44:40.593005 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:40.591088 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:44:40.593005 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:40.591123 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrjl4\"" Mar 18 16:44:41.018764 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:41.018679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:41.018939 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:41.018824 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:41.018939 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:41.018839 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d7f745b7-6hv58: secret "image-registry-tls" not found Mar 18 16:44:41.018939 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:41.018909 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls podName:73d3b30d-1d2b-4666-8183-1a58e717bac7 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:43.018889565 +0000 UTC m=+37.030998383 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls") pod "image-registry-86d7f745b7-6hv58" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7") : secret "image-registry-tls" not found Mar 18 16:44:41.119732 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:41.119690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:41.119923 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:41.119752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:41.119923 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:41.119866 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:41.120033 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:41.119947 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls podName:f604a9b3-7355-4f54-8113-531b6f45d7f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:43.119926812 +0000 UTC m=+37.132035629 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls") pod "dns-default-bv7gz" (UID: "f604a9b3-7355-4f54-8113-531b6f45d7f2") : secret "dns-default-metrics-tls" not found Mar 18 16:44:41.120033 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:41.119956 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:41.120033 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:41.120032 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert podName:3d42738c-ceaa-4925-8255-a2b61010e00f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:43.120015144 +0000 UTC m=+37.132123947 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert") pod "ingress-canary-gwqfh" (UID: "3d42738c-ceaa-4925-8255-a2b61010e00f") : secret "canary-serving-cert" not found Mar 18 16:44:41.398105 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:41.398063 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ml2kv"] Mar 18 16:44:41.402206 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:41.402169 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fad1cd6_2abc_416f_8534_fda50c4cccd9.slice/crio-5e76ab3599f6b68cf41e9e4cd177cf45434feee2c3c1f76f87256da3a4ee136f WatchSource:0}: Error finding container 5e76ab3599f6b68cf41e9e4cd177cf45434feee2c3c1f76f87256da3a4ee136f: Status 404 returned error can't find the container with id 5e76ab3599f6b68cf41e9e4cd177cf45434feee2c3c1f76f87256da3a4ee136f Mar 18 16:44:41.735487 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:41.735300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ml2kv" event={"ID":"6fad1cd6-2abc-416f-8534-fda50c4cccd9","Type":"ContainerStarted","Data":"5e76ab3599f6b68cf41e9e4cd177cf45434feee2c3c1f76f87256da3a4ee136f"} Mar 18 16:44:41.737966 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:41.737936 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m75lt" event={"ID":"42ddca27-69b7-4448-a710-9cc62df43c14","Type":"ContainerStarted","Data":"bf7020848a4fd4e66c302ace39995f3dc8faff4cf710738269a9c7590ed030dc"} Mar 18 16:44:42.743010 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:42.742938 2578 generic.go:358] "Generic (PLEG): container finished" podID="42ddca27-69b7-4448-a710-9cc62df43c14" containerID="bf7020848a4fd4e66c302ace39995f3dc8faff4cf710738269a9c7590ed030dc" exitCode=0 Mar 18 16:44:42.743408 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:42.743003 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m75lt" event={"ID":"42ddca27-69b7-4448-a710-9cc62df43c14","Type":"ContainerDied","Data":"bf7020848a4fd4e66c302ace39995f3dc8faff4cf710738269a9c7590ed030dc"} Mar 18 16:44:43.037991 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:43.037901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " 
pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:43.038137 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:43.038055 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:43.038137 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:43.038073 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d7f745b7-6hv58: secret "image-registry-tls" not found Mar 18 16:44:43.038237 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:43.038145 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls podName:73d3b30d-1d2b-4666-8183-1a58e717bac7 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:47.038122952 +0000 UTC m=+41.050231762 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls") pod "image-registry-86d7f745b7-6hv58" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7") : secret "image-registry-tls" not found Mar 18 16:44:43.139228 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:43.139186 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:43.139408 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:43.139238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:43.139408 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:43.139345 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:43.139501 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:43.139410 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:43.139501 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:43.139425 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls podName:f604a9b3-7355-4f54-8113-531b6f45d7f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:47.139405269 +0000 UTC m=+41.151514090 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls") pod "dns-default-bv7gz" (UID: "f604a9b3-7355-4f54-8113-531b6f45d7f2") : secret "dns-default-metrics-tls" not found Mar 18 16:44:43.139501 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:43.139473 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert podName:3d42738c-ceaa-4925-8255-a2b61010e00f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:47.139455329 +0000 UTC m=+41.151564135 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert") pod "ingress-canary-gwqfh" (UID: "3d42738c-ceaa-4925-8255-a2b61010e00f") : secret "canary-serving-cert" not found Mar 18 16:44:43.748340 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:43.748300 2578 generic.go:358] "Generic (PLEG): container finished" podID="42ddca27-69b7-4448-a710-9cc62df43c14" containerID="4f3d5c09e7be2b4827991a9b46aa3b213dd3923c5c9a6e0a603f7e1e1ae6c9ef" exitCode=0 Mar 18 16:44:43.748806 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:43.748377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m75lt" event={"ID":"42ddca27-69b7-4448-a710-9cc62df43c14","Type":"ContainerDied","Data":"4f3d5c09e7be2b4827991a9b46aa3b213dd3923c5c9a6e0a603f7e1e1ae6c9ef"} Mar 18 16:44:44.753092 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:44.753056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m75lt" event={"ID":"42ddca27-69b7-4448-a710-9cc62df43c14","Type":"ContainerStarted","Data":"3d3c1c767b1f5cb404d5cade80b1c610fbecf29e8cf598b0c2a7584a36ea7907"} Mar 18 16:44:44.754461 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:44.754432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ml2kv" event={"ID":"6fad1cd6-2abc-416f-8534-fda50c4cccd9","Type":"ContainerStarted","Data":"74a119670b6674a12148a7b83e89a76b2ce5f105b9a42003cedb2cc60127cd73"} Mar 18 16:44:44.754607 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:44.754587 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:44:44.774211 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:44.774156 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m75lt" podStartSLOduration=6.605883992 podStartE2EDuration="38.774139734s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:09.378105305 +0000 UTC m=+3.390214107" lastFinishedPulling="2026-03-18 16:44:41.546361046 +0000 UTC m=+35.558469849" observedRunningTime="2026-03-18 16:44:44.773320884 +0000 UTC m=+38.785429711" watchObservedRunningTime="2026-03-18 16:44:44.774139734 +0000 UTC m=+38.786248562" Mar 18 16:44:44.787983 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:44.787940 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ml2kv" podStartSLOduration=35.724168895 podStartE2EDuration="38.787927262s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:41.523347019 +0000 UTC m=+35.535455835" lastFinishedPulling="2026-03-18 16:44:44.587105385 +0000 UTC m=+38.599214202" observedRunningTime="2026-03-18 16:44:44.787084129 +0000 UTC m=+38.799192954" watchObservedRunningTime="2026-03-18 16:44:44.787927262 +0000 UTC m=+38.800036087" Mar 18 16:44:47.070998 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:47.070954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:47.071383 
ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:47.071115 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:47.071383 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:47.071135 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d7f745b7-6hv58: secret "image-registry-tls" not found Mar 18 16:44:47.071383 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:47.071192 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls podName:73d3b30d-1d2b-4666-8183-1a58e717bac7 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:55.071176926 +0000 UTC m=+49.083285738 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls") pod "image-registry-86d7f745b7-6hv58" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7") : secret "image-registry-tls" not found Mar 18 16:44:47.172146 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:47.172102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:47.172146 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:47.172148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:47.172373 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:47.172255 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:47.172373 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:47.172318 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls podName:f604a9b3-7355-4f54-8113-531b6f45d7f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:55.172302356 +0000 UTC m=+49.184411158 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls") pod "dns-default-bv7gz" (UID: "f604a9b3-7355-4f54-8113-531b6f45d7f2") : secret "dns-default-metrics-tls" not found Mar 18 16:44:47.172373 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:47.172266 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:47.172476 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:47.172393 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert podName:3d42738c-ceaa-4925-8255-a2b61010e00f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:55.172380337 +0000 UTC m=+49.184489145 (durationBeforeRetry 8s). 
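The two pod_startup_latency_tracker entries above are internally consistent: podStartE2EDuration is the observed running time minus podCreationTimestamp, and podStartSLOduration additionally subtracts the time spent pulling images. Recomputing the multus-additional-cni-plugins figures from the timestamps printed in the log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        // Values copied from podCreationTimestamp, watchObservedRunningTime,
        // firstStartedPulling, and lastFinishedPulling in the entry above.
        created := parse("2026-03-18 16:44:06 +0000 UTC")
        running := parse("2026-03-18 16:44:44.774139734 +0000 UTC")
        pullStart := parse("2026-03-18 16:44:09.378105305 +0000 UTC")
        pullEnd := parse("2026-03-18 16:44:41.546361046 +0000 UTC")

        e2e := running.Sub(created)    // 38.774139734s = podStartE2EDuration
        pull := pullEnd.Sub(pullStart) // ~32.168s spent pulling images
        fmt.Println(e2e, e2e-pull)     // podStartSLOduration ~6.606s, matching
                                       // the log's 6.605883992s to within rounding
    }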
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert") pod "ingress-canary-gwqfh" (UID: "3d42738c-ceaa-4925-8255-a2b61010e00f") : secret "canary-serving-cert" not found Mar 18 16:44:52.308245 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:52.308195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:52.312151 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:52.312122 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/31d1ce77-15fd-48f5-844f-07de4a0cdfc4-original-pull-secret\") pod \"global-pull-secret-syncer-52khs\" (UID: \"31d1ce77-15fd-48f5-844f-07de4a0cdfc4\") " pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:52.499823 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:52.499782 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-52khs" Mar 18 16:44:52.631874 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:52.631825 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-52khs"] Mar 18 16:44:52.636267 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:44:52.636241 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31d1ce77_15fd_48f5_844f_07de4a0cdfc4.slice/crio-eb84f8dbccfa0a5d9e82b2ae6275e535376509e21468652efe5fb6235ce3d72d WatchSource:0}: Error finding container eb84f8dbccfa0a5d9e82b2ae6275e535376509e21468652efe5fb6235ce3d72d: Status 404 returned error can't find the container with id eb84f8dbccfa0a5d9e82b2ae6275e535376509e21468652efe5fb6235ce3d72d Mar 18 16:44:52.771287 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:52.771256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-52khs" event={"ID":"31d1ce77-15fd-48f5-844f-07de4a0cdfc4","Type":"ContainerStarted","Data":"eb84f8dbccfa0a5d9e82b2ae6275e535376509e21468652efe5fb6235ce3d72d"} Mar 18 16:44:55.130467 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:55.130431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:44:55.130867 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:55.130599 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:55.130867 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:55.130619 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d7f745b7-6hv58: secret "image-registry-tls" not found Mar 18 16:44:55.130867 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:55.130675 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls podName:73d3b30d-1d2b-4666-8183-1a58e717bac7 nodeName:}" 
failed. No retries permitted until 2026-03-18 16:45:11.130658689 +0000 UTC m=+65.142767493 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls") pod "image-registry-86d7f745b7-6hv58" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7") : secret "image-registry-tls" not found Mar 18 16:44:55.231519 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:55.231473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:44:55.231729 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:55.231546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:44:55.231729 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:55.231644 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:55.231729 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:55.231706 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:55.231885 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:55.231751 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls podName:f604a9b3-7355-4f54-8113-531b6f45d7f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:11.231728573 +0000 UTC m=+65.243837381 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls") pod "dns-default-bv7gz" (UID: "f604a9b3-7355-4f54-8113-531b6f45d7f2") : secret "dns-default-metrics-tls" not found Mar 18 16:44:55.231885 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:44:55.231771 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert podName:3d42738c-ceaa-4925-8255-a2b61010e00f nodeName:}" failed. No retries permitted until 2026-03-18 16:45:11.231762237 +0000 UTC m=+65.243871044 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert") pod "ingress-canary-gwqfh" (UID: "3d42738c-ceaa-4925-8255-a2b61010e00f") : secret "canary-serving-cert" not found Mar 18 16:44:57.783661 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:57.783625 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-52khs" event={"ID":"31d1ce77-15fd-48f5-844f-07de4a0cdfc4","Type":"ContainerStarted","Data":"2498e4d189e17a7329aef86220f325c2b9595dae2b8bc22249f0a45cdc7e55e5"} Mar 18 16:44:57.814054 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:44:57.814005 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-52khs" podStartSLOduration=33.716075936 podStartE2EDuration="37.813990271s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:44:52.63799262 +0000 UTC m=+46.650101427" lastFinishedPulling="2026-03-18 16:44:56.735906959 +0000 UTC m=+50.748015762" observedRunningTime="2026-03-18 16:44:57.813838634 +0000 UTC m=+51.825947459" watchObservedRunningTime="2026-03-18 16:44:57.813990271 +0000 UTC m=+51.826099095" Mar 18 16:45:05.734187 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:45:05.734156 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrlf7" Mar 18 16:45:11.141175 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:45:11.141137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:45:11.141617 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:11.141295 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:45:11.141617 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:11.141316 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d7f745b7-6hv58: secret "image-registry-tls" not found Mar 18 16:45:11.141617 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:11.141386 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls podName:73d3b30d-1d2b-4666-8183-1a58e717bac7 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:43.141366598 +0000 UTC m=+97.153475402 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls") pod "image-registry-86d7f745b7-6hv58" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7") : secret "image-registry-tls" not found Mar 18 16:45:11.241839 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:45:11.241803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:45:11.241839 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:45:11.241843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:45:11.242061 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:11.241944 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:11.242061 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:11.241997 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert podName:3d42738c-ceaa-4925-8255-a2b61010e00f nodeName:}" failed. No retries permitted until 2026-03-18 16:45:43.241981513 +0000 UTC m=+97.254090316 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert") pod "ingress-canary-gwqfh" (UID: "3d42738c-ceaa-4925-8255-a2b61010e00f") : secret "canary-serving-cert" not found Mar 18 16:45:11.242061 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:11.241944 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:11.242165 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:11.242083 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls podName:f604a9b3-7355-4f54-8113-531b6f45d7f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:43.242070565 +0000 UTC m=+97.254179367 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls") pod "dns-default-bv7gz" (UID: "f604a9b3-7355-4f54-8113-531b6f45d7f2") : secret "dns-default-metrics-tls" not found Mar 18 16:45:12.348387 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:45:12.348340 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:45:12.350645 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:45:12.350624 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:45:12.359536 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:12.359506 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 18 16:45:12.359623 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:12.359602 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs podName:4bd683e7-9eae-4e40-ba74-10c5acb6fd8b nodeName:}" failed. No retries permitted until 2026-03-18 16:46:16.359583192 +0000 UTC m=+130.371691995 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs") pod "network-metrics-daemon-bcgtq" (UID: "4bd683e7-9eae-4e40-ba74-10c5acb6fd8b") : secret "metrics-daemon-secret" not found Mar 18 16:45:15.758461 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:45:15.758434 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ml2kv" Mar 18 16:45:43.166049 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:45:43.165917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:45:43.166049 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:43.166037 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:45:43.166049 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:43.166051 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d7f745b7-6hv58: secret "image-registry-tls" not found Mar 18 16:45:43.166608 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:43.166116 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls podName:73d3b30d-1d2b-4666-8183-1a58e717bac7 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:47.166096678 +0000 UTC m=+161.178205487 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls") pod "image-registry-86d7f745b7-6hv58" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7") : secret "image-registry-tls" not found Mar 18 16:45:43.266652 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:45:43.266605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:45:43.266772 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:45:43.266663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:45:43.266809 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:43.266770 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:43.266840 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:43.266830 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert podName:3d42738c-ceaa-4925-8255-a2b61010e00f nodeName:}" failed. No retries permitted until 2026-03-18 16:46:47.266815388 +0000 UTC m=+161.278924197 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert") pod "ingress-canary-gwqfh" (UID: "3d42738c-ceaa-4925-8255-a2b61010e00f") : secret "canary-serving-cert" not found Mar 18 16:45:43.266882 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:43.266770 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:43.266882 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:45:43.266866 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls podName:f604a9b3-7355-4f54-8113-531b6f45d7f2 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:47.266856901 +0000 UTC m=+161.278965707 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls") pod "dns-default-bv7gz" (UID: "f604a9b3-7355-4f54-8113-531b6f45d7f2") : secret "dns-default-metrics-tls" not found Mar 18 16:46:16.388791 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:16.388749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:46:16.389268 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:46:16.388862 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 18 16:46:16.389268 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:46:16.388921 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs podName:4bd683e7-9eae-4e40-ba74-10c5acb6fd8b nodeName:}" failed. No retries permitted until 2026-03-18 16:48:18.388907287 +0000 UTC m=+252.401016090 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs") pod "network-metrics-daemon-bcgtq" (UID: "4bd683e7-9eae-4e40-ba74-10c5acb6fd8b") : secret "metrics-daemon-secret" not found Mar 18 16:46:32.875452 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.875415 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k"] Mar 18 16:46:32.878262 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.878239 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-x922g"] Mar 18 16:46:32.878404 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.878386 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k" Mar 18 16:46:32.881612 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.881584 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:32.881797 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.881760 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Mar 18 16:46:32.881891 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.881769 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-k5bg4\"" Mar 18 16:46:32.882591 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.882572 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:46:32.883812 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.883793 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Mar 18 16:46:32.883922 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.883811 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-xhcft\"" Mar 18 16:46:32.883986 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.883942 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Mar 18 16:46:32.883986 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.883943 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Mar 18 16:46:32.884764 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.884746 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Mar 18 16:46:32.889600 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.889581 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Mar 18 16:46:32.893352 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.893330 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-x922g"] Mar 18 16:46:32.894423 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.894398 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k"] Mar 18 16:46:32.959700 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.959668 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz"] Mar 18 16:46:32.962475 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.962460 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz" Mar 18 16:46:32.964567 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.964546 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-87nfg\"" Mar 18 16:46:32.968875 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:32.968853 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz"] Mar 18 16:46:33.004044 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.004017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e0afa5-670e-42e7-8743-980838b63847-service-ca-bundle\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.004215 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.004053 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/61e0afa5-670e-42e7-8743-980838b63847-snapshots\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.004215 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.004142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e0afa5-670e-42e7-8743-980838b63847-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.004334 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.004233 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/61e0afa5-670e-42e7-8743-980838b63847-tmp\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.004334 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.004297 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddlkl\" (UniqueName: \"kubernetes.io/projected/61e0afa5-670e-42e7-8743-980838b63847-kube-api-access-ddlkl\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.004415 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.004339 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61e0afa5-670e-42e7-8743-980838b63847-serving-cert\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.004415 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.004394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzl5x\" (UniqueName: \"kubernetes.io/projected/e11605b2-ec2b-4f4f-82c3-4db202827f31-kube-api-access-pzl5x\") pod 
\"volume-data-source-validator-67fdcb5769-8bt8k\" (UID: \"e11605b2-ec2b-4f4f-82c3-4db202827f31\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k" Mar 18 16:46:33.105389 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.105350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/61e0afa5-670e-42e7-8743-980838b63847-tmp\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.105389 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.105394 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddlkl\" (UniqueName: \"kubernetes.io/projected/61e0afa5-670e-42e7-8743-980838b63847-kube-api-access-ddlkl\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.105606 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.105427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61e0afa5-670e-42e7-8743-980838b63847-serving-cert\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.105606 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.105449 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzl5x\" (UniqueName: \"kubernetes.io/projected/e11605b2-ec2b-4f4f-82c3-4db202827f31-kube-api-access-pzl5x\") pod \"volume-data-source-validator-67fdcb5769-8bt8k\" (UID: \"e11605b2-ec2b-4f4f-82c3-4db202827f31\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k" Mar 18 16:46:33.105606 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.105481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e0afa5-670e-42e7-8743-980838b63847-service-ca-bundle\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.105606 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.105501 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/61e0afa5-670e-42e7-8743-980838b63847-snapshots\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.105606 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.105552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r8q2\" (UniqueName: \"kubernetes.io/projected/8ed2bd8c-3d8b-4f0b-84ac-e32e792e1e83-kube-api-access-6r8q2\") pod \"network-check-source-cc88fdd44-2v9sz\" (UID: \"8ed2bd8c-3d8b-4f0b-84ac-e32e792e1e83\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz" Mar 18 16:46:33.105606 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.105587 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e0afa5-670e-42e7-8743-980838b63847-trusted-ca-bundle\") 
pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.105868 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.105847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/61e0afa5-670e-42e7-8743-980838b63847-tmp\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.106109 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.106082 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/61e0afa5-670e-42e7-8743-980838b63847-snapshots\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.106247 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.106221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e0afa5-670e-42e7-8743-980838b63847-service-ca-bundle\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.106418 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.106402 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61e0afa5-670e-42e7-8743-980838b63847-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.107918 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.107890 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61e0afa5-670e-42e7-8743-980838b63847-serving-cert\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.113543 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.113508 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddlkl\" (UniqueName: \"kubernetes.io/projected/61e0afa5-670e-42e7-8743-980838b63847-kube-api-access-ddlkl\") pod \"insights-operator-76bdd9f478-x922g\" (UID: \"61e0afa5-670e-42e7-8743-980838b63847\") " pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.113646 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.113619 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzl5x\" (UniqueName: \"kubernetes.io/projected/e11605b2-ec2b-4f4f-82c3-4db202827f31-kube-api-access-pzl5x\") pod \"volume-data-source-validator-67fdcb5769-8bt8k\" (UID: \"e11605b2-ec2b-4f4f-82c3-4db202827f31\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k" Mar 18 16:46:33.189245 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.189161 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k" Mar 18 16:46:33.194915 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.194890 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-76bdd9f478-x922g" Mar 18 16:46:33.206612 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.206591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6r8q2\" (UniqueName: \"kubernetes.io/projected/8ed2bd8c-3d8b-4f0b-84ac-e32e792e1e83-kube-api-access-6r8q2\") pod \"network-check-source-cc88fdd44-2v9sz\" (UID: \"8ed2bd8c-3d8b-4f0b-84ac-e32e792e1e83\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz" Mar 18 16:46:33.215428 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.215399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r8q2\" (UniqueName: \"kubernetes.io/projected/8ed2bd8c-3d8b-4f0b-84ac-e32e792e1e83-kube-api-access-6r8q2\") pod \"network-check-source-cc88fdd44-2v9sz\" (UID: \"8ed2bd8c-3d8b-4f0b-84ac-e32e792e1e83\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz" Mar 18 16:46:33.270940 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.270913 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz" Mar 18 16:46:33.315678 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.315644 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k"] Mar 18 16:46:33.318558 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:46:33.318515 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode11605b2_ec2b_4f4f_82c3_4db202827f31.slice/crio-3adcabc33f854c9ea7c17d4d7dbe1cd32df6127931c40cf94ff1e4de90a0f6aa WatchSource:0}: Error finding container 3adcabc33f854c9ea7c17d4d7dbe1cd32df6127931c40cf94ff1e4de90a0f6aa: Status 404 returned error can't find the container with id 3adcabc33f854c9ea7c17d4d7dbe1cd32df6127931c40cf94ff1e4de90a0f6aa Mar 18 16:46:33.328704 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.328672 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-x922g"] Mar 18 16:46:33.333734 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:46:33.333705 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61e0afa5_670e_42e7_8743_980838b63847.slice/crio-21e805c64f1a003e229f5d27d5939860836b70a620340628a39cff0810af003f WatchSource:0}: Error finding container 21e805c64f1a003e229f5d27d5939860836b70a620340628a39cff0810af003f: Status 404 returned error can't find the container with id 21e805c64f1a003e229f5d27d5939860836b70a620340628a39cff0810af003f Mar 18 16:46:33.393790 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.393752 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz"] Mar 18 16:46:33.397458 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:46:33.397415 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ed2bd8c_3d8b_4f0b_84ac_e32e792e1e83.slice/crio-4217e107881679eb32ba71797d4f7df07c2cd9fcd3c4a3d21cca72e87acba897 WatchSource:0}: Error finding container 4217e107881679eb32ba71797d4f7df07c2cd9fcd3c4a3d21cca72e87acba897: Status 404 returned error can't find the container with id 4217e107881679eb32ba71797d4f7df07c2cd9fcd3c4a3d21cca72e87acba897 Mar 18 16:46:33.961476 
ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.961437 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz" event={"ID":"8ed2bd8c-3d8b-4f0b-84ac-e32e792e1e83","Type":"ContainerStarted","Data":"2519a30f5d29b8616a5209a1858b72d75abb68b23f21f7351b990f70dc82c09b"}
Mar 18 16:46:33.961476 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.961478 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz" event={"ID":"8ed2bd8c-3d8b-4f0b-84ac-e32e792e1e83","Type":"ContainerStarted","Data":"4217e107881679eb32ba71797d4f7df07c2cd9fcd3c4a3d21cca72e87acba897"}
Mar 18 16:46:33.962703 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.962670 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-x922g" event={"ID":"61e0afa5-670e-42e7-8743-980838b63847","Type":"ContainerStarted","Data":"21e805c64f1a003e229f5d27d5939860836b70a620340628a39cff0810af003f"}
Mar 18 16:46:33.963763 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.963733 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k" event={"ID":"e11605b2-ec2b-4f4f-82c3-4db202827f31","Type":"ContainerStarted","Data":"3adcabc33f854c9ea7c17d4d7dbe1cd32df6127931c40cf94ff1e4de90a0f6aa"}
Mar 18 16:46:33.993403 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:33.993249 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-2v9sz" podStartSLOduration=1.993231535 podStartE2EDuration="1.993231535s" podCreationTimestamp="2026-03-18 16:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:46:33.993024612 +0000 UTC m=+148.005133438" watchObservedRunningTime="2026-03-18 16:46:33.993231535 +0000 UTC m=+148.005340359"
Mar 18 16:46:34.966940 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:34.966849 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k" event={"ID":"e11605b2-ec2b-4f4f-82c3-4db202827f31","Type":"ContainerStarted","Data":"fb65bff15dd8c0273eefec2d1aeec5bb075d4618a7fc9237d2370bc1e7d7c863"}
Mar 18 16:46:34.981015 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:34.980953 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-8bt8k" podStartSLOduration=1.754897124 podStartE2EDuration="2.980934105s" podCreationTimestamp="2026-03-18 16:46:32 +0000 UTC" firstStartedPulling="2026-03-18 16:46:33.323276352 +0000 UTC m=+147.335385168" lastFinishedPulling="2026-03-18 16:46:34.549313338 +0000 UTC m=+148.561422149" observedRunningTime="2026-03-18 16:46:34.980277744 +0000 UTC m=+148.992386568" watchObservedRunningTime="2026-03-18 16:46:34.980934105 +0000 UTC m=+148.993042932"
Mar 18 16:46:35.970158 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:35.970070 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-x922g" event={"ID":"61e0afa5-670e-42e7-8743-980838b63847","Type":"ContainerStarted","Data":"288332924e1de03a5bc1cf8b18e3caba9648b6f1116374355537fc7d9b0b6ec7"}
Mar 18 16:46:35.985573 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:35.985507 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-76bdd9f478-x922g" podStartSLOduration=1.76056247 podStartE2EDuration="3.985492684s" podCreationTimestamp="2026-03-18 16:46:32 +0000 UTC" firstStartedPulling="2026-03-18 16:46:33.335642336 +0000 UTC m=+147.347751139" lastFinishedPulling="2026-03-18 16:46:35.560572546 +0000 UTC m=+149.572681353" observedRunningTime="2026-03-18 16:46:35.985055722 +0000 UTC m=+149.997164553" watchObservedRunningTime="2026-03-18 16:46:35.985492684 +0000 UTC m=+149.997601508"
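The two durations in each pod_startup_latency_tracker line are related: the numbers above are consistent with podStartSLOduration being the end-to-end startup time minus the image-pull window. A small Go check against the insights-operator entry directly above; the subtraction rule is inferred from these log values, not taken from the tracker's source.

```go
package main

import (
	"fmt"
	"time"
)

// Recompute the insights-operator durations from the timestamps logged above.
// Inferred relation (an assumption fitted to these entries, not kubelet code):
//   podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp
//   podStartSLOduration = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling)
func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST" // Go accepts fractional seconds while parsing
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-03-18 16:46:32 +0000 UTC")
	firstPull := parse("2026-03-18 16:46:33.335642336 +0000 UTC")
	lastPull := parse("2026-03-18 16:46:35.560572546 +0000 UTC")
	running := parse("2026-03-18 16:46:35.985492684 +0000 UTC")

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 3.985492684s, as logged
	fmt.Println("podStartSLOduration:", slo) // 1.760562474s; the log prints 1.76056247
}
```

The same relation explains the network-check-source entry above: its pull timestamps are the zero time ("0001-01-01"), so the pull window contributes nothing and the SLO and E2E durations coincide at 1.993231535s.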
Mar 18 16:46:39.499375 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.499345 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-m547p"]
Mar 18 16:46:39.502428 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.502412 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-8bb587b94-m547p"
Mar 18 16:46:39.504480 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.504458 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Mar 18 16:46:39.504639 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.504458 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Mar 18 16:46:39.504744 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.504730 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Mar 18 16:46:39.505437 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.505422 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-gwdt2\""
Mar 18 16:46:39.505487 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.505469 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Mar 18 16:46:39.508750 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.508729 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-m547p"]
Mar 18 16:46:39.659710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.659672 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/735def57-8d36-430a-9310-ce8da7f3acde-signing-cabundle\") pod \"service-ca-8bb587b94-m547p\" (UID: \"735def57-8d36-430a-9310-ce8da7f3acde\") " pod="openshift-service-ca/service-ca-8bb587b94-m547p"
Mar 18 16:46:39.659710 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.659716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8sq5\" (UniqueName: \"kubernetes.io/projected/735def57-8d36-430a-9310-ce8da7f3acde-kube-api-access-x8sq5\") pod \"service-ca-8bb587b94-m547p\" (UID: \"735def57-8d36-430a-9310-ce8da7f3acde\") " pod="openshift-service-ca/service-ca-8bb587b94-m547p"
Mar 18 16:46:39.659930 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.659747 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/735def57-8d36-430a-9310-ce8da7f3acde-signing-key\") pod \"service-ca-8bb587b94-m547p\" (UID: \"735def57-8d36-430a-9310-ce8da7f3acde\") " pod="openshift-service-ca/service-ca-8bb587b94-m547p" Mar 18 16:46:39.760137 
ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.760048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8sq5\" (UniqueName: \"kubernetes.io/projected/735def57-8d36-430a-9310-ce8da7f3acde-kube-api-access-x8sq5\") pod \"service-ca-8bb587b94-m547p\" (UID: \"735def57-8d36-430a-9310-ce8da7f3acde\") " pod="openshift-service-ca/service-ca-8bb587b94-m547p" Mar 18 16:46:39.760137 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.760101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/735def57-8d36-430a-9310-ce8da7f3acde-signing-key\") pod \"service-ca-8bb587b94-m547p\" (UID: \"735def57-8d36-430a-9310-ce8da7f3acde\") " pod="openshift-service-ca/service-ca-8bb587b94-m547p" Mar 18 16:46:39.760361 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.760231 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/735def57-8d36-430a-9310-ce8da7f3acde-signing-cabundle\") pod \"service-ca-8bb587b94-m547p\" (UID: \"735def57-8d36-430a-9310-ce8da7f3acde\") " pod="openshift-service-ca/service-ca-8bb587b94-m547p" Mar 18 16:46:39.760836 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.760805 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/735def57-8d36-430a-9310-ce8da7f3acde-signing-cabundle\") pod \"service-ca-8bb587b94-m547p\" (UID: \"735def57-8d36-430a-9310-ce8da7f3acde\") " pod="openshift-service-ca/service-ca-8bb587b94-m547p" Mar 18 16:46:39.762676 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.762647 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/735def57-8d36-430a-9310-ce8da7f3acde-signing-key\") pod \"service-ca-8bb587b94-m547p\" (UID: \"735def57-8d36-430a-9310-ce8da7f3acde\") " pod="openshift-service-ca/service-ca-8bb587b94-m547p" Mar 18 16:46:39.769000 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.768981 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8sq5\" (UniqueName: \"kubernetes.io/projected/735def57-8d36-430a-9310-ce8da7f3acde-kube-api-access-x8sq5\") pod \"service-ca-8bb587b94-m547p\" (UID: \"735def57-8d36-430a-9310-ce8da7f3acde\") " pod="openshift-service-ca/service-ca-8bb587b94-m547p" Mar 18 16:46:39.811863 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.811833 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-8bb587b94-m547p"
Mar 18 16:46:39.936245 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.936212 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-m547p"]
Mar 18 16:46:39.939112 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:46:39.939084 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod735def57_8d36_430a_9310_ce8da7f3acde.slice/crio-9ccbde354e6ece630f9bd55cad4437de41d8f2bd6ff5f9e08949bcdd925e77bc WatchSource:0}: Error finding container 9ccbde354e6ece630f9bd55cad4437de41d8f2bd6ff5f9e08949bcdd925e77bc: Status 404 returned error can't find the container with id 9ccbde354e6ece630f9bd55cad4437de41d8f2bd6ff5f9e08949bcdd925e77bc
Mar 18 16:46:39.979024 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:39.978995 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8bb587b94-m547p" event={"ID":"735def57-8d36-430a-9310-ce8da7f3acde","Type":"ContainerStarted","Data":"9ccbde354e6ece630f9bd55cad4437de41d8f2bd6ff5f9e08949bcdd925e77bc"}
Mar 18 16:46:40.408253 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:40.408225 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s8lc2_824b5ab3-8c23-48c6-a404-bc9472781b90/dns-node-resolver/0.log"
Mar 18 16:46:41.007690 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:41.007666 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bgzmm_f79627fb-f2a3-4632-89c2-5d99716c1cc9/node-ca/0.log"
Mar 18 16:46:41.985495 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:41.985408 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8bb587b94-m547p" event={"ID":"735def57-8d36-430a-9310-ce8da7f3acde","Type":"ContainerStarted","Data":"b1457a693368fc6a0aa8e57f3a1abe170265dd7ee5ebe54f5cfa6bf8a626de73"}
Mar 18 16:46:42.001572 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:42.001455 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-8bb587b94-m547p" podStartSLOduration=1.407725874 podStartE2EDuration="3.001440104s" podCreationTimestamp="2026-03-18 16:46:39 +0000 UTC" firstStartedPulling="2026-03-18 16:46:39.940924058 +0000 UTC m=+153.953032861" lastFinishedPulling="2026-03-18 16:46:41.534638286 +0000 UTC m=+155.546747091" observedRunningTime="2026-03-18 16:46:42.001112276 +0000 UTC m=+156.013221103" watchObservedRunningTime="2026-03-18 16:46:42.001440104 +0000 UTC m=+156.013548936"
Mar 18 16:46:42.351227 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:46:42.351182 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" podUID="73d3b30d-1d2b-4666-8183-1a58e717bac7"
Mar 18 16:46:42.363254 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:46:42.363228 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-bv7gz" podUID="f604a9b3-7355-4f54-8113-531b6f45d7f2"
Mar 18 16:46:42.388720 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:46:42.388688 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gwqfh" podUID="3d42738c-ceaa-4925-8255-a2b61010e00f"
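At this point the image-registry, dns-default and ingress-canary pod workers have given up on the current sync attempt: the volume-mount context expired while the Secrets were still missing ("context deadline exceeded"). The object to watch when debugging this is the Secret itself; a minimal client-go sketch that checks for the one the canary pod is blocked on, where the kubeconfig path is an illustrative assumption (namespace and secret name come from the log lines above):

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Illustrative assumption: a kubeconfig at the default ~/.kube/config
	// location; inside a cluster you would use rest.InClusterConfig instead.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Namespace and name taken from the mount failures logged above.
	s, err := client.CoreV1().Secrets("openshift-ingress-canary").Get(
		context.TODO(), "canary-serving-cert", metav1.GetOptions{})
	if err != nil {
		fmt.Println("still missing:", err) // the kubelet keeps retrying until this succeeds
		return
	}
	fmt.Println("secret present, created:", s.CreationTimestamp)
}
```

In this log two of the missing secrets do appear: registry-tls and metrics-tls mount successfully at 16:46:47 just below, so those failures read as ordering noise during bootstrap, while canary-serving-cert is still absent and moves to the 2m2s backoff cap.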
Mar 18 16:46:42.987751 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:42.987713 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gwqfh"
Mar 18 16:46:42.987751 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:42.987739 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58"
Mar 18 16:46:42.987969 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:42.987795 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bv7gz"
Mar 18 16:46:43.602861 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:46:43.602816 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-bcgtq" podUID="4bd683e7-9eae-4e40-ba74-10c5acb6fd8b"
Mar 18 16:46:47.224107 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.224074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58"
Mar 18 16:46:47.226314 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.226291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"image-registry-86d7f745b7-6hv58\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " pod="openshift-image-registry/image-registry-86d7f745b7-6hv58"
Mar 18 16:46:47.324584 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.324509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz"
Mar 18 16:46:47.324756 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.324611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh"
Mar 18 16:46:47.324801 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:46:47.324770 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:46:47.324850 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:46:47.324839 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert podName:3d42738c-ceaa-4925-8255-a2b61010e00f nodeName:}" failed. No retries permitted until 2026-03-18 16:48:49.324823475 +0000 UTC m=+283.336932278 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert") pod "ingress-canary-gwqfh" (UID: "3d42738c-ceaa-4925-8255-a2b61010e00f") : secret "canary-serving-cert" not found Mar 18 16:46:47.326812 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.326783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f604a9b3-7355-4f54-8113-531b6f45d7f2-metrics-tls\") pod \"dns-default-bv7gz\" (UID: \"f604a9b3-7355-4f54-8113-531b6f45d7f2\") " pod="openshift-dns/dns-default-bv7gz" Mar 18 16:46:47.491782 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.491705 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-775g5\"" Mar 18 16:46:47.491782 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.491718 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rcgfm\"" Mar 18 16:46:47.499420 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.499395 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bv7gz" Mar 18 16:46:47.499547 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.499483 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:46:47.843914 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.843884 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bv7gz"] Mar 18 16:46:47.846514 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:47.846485 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86d7f745b7-6hv58"] Mar 18 16:46:47.847757 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:46:47.847732 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf604a9b3_7355_4f54_8113_531b6f45d7f2.slice/crio-aadfd38a0f2547d2f47beb8f56b82d0bb01d033edafe02baa4e56b4f2acc50f8 WatchSource:0}: Error finding container aadfd38a0f2547d2f47beb8f56b82d0bb01d033edafe02baa4e56b4f2acc50f8: Status 404 returned error can't find the container with id aadfd38a0f2547d2f47beb8f56b82d0bb01d033edafe02baa4e56b4f2acc50f8 Mar 18 16:46:47.850316 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:46:47.850297 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d3b30d_1d2b_4666_8183_1a58e717bac7.slice/crio-dc43ee6408b863137d0157aed3e95e5aac15b5fee1c3753e732aeabfcc722e07 WatchSource:0}: Error finding container dc43ee6408b863137d0157aed3e95e5aac15b5fee1c3753e732aeabfcc722e07: Status 404 returned error can't find the container with id dc43ee6408b863137d0157aed3e95e5aac15b5fee1c3753e732aeabfcc722e07 Mar 18 16:46:48.007383 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:48.007349 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" event={"ID":"73d3b30d-1d2b-4666-8183-1a58e717bac7","Type":"ContainerStarted","Data":"d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236"} Mar 18 16:46:48.007588 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:48.007391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" 
event={"ID":"73d3b30d-1d2b-4666-8183-1a58e717bac7","Type":"ContainerStarted","Data":"dc43ee6408b863137d0157aed3e95e5aac15b5fee1c3753e732aeabfcc722e07"} Mar 18 16:46:48.007588 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:48.007540 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:46:48.008845 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:48.008816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bv7gz" event={"ID":"f604a9b3-7355-4f54-8113-531b6f45d7f2","Type":"ContainerStarted","Data":"aadfd38a0f2547d2f47beb8f56b82d0bb01d033edafe02baa4e56b4f2acc50f8"} Mar 18 16:46:48.030248 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:48.030131 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" podStartSLOduration=171.030097583 podStartE2EDuration="2m51.030097583s" podCreationTimestamp="2026-03-18 16:43:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:46:48.02913352 +0000 UTC m=+162.041242339" watchObservedRunningTime="2026-03-18 16:46:48.030097583 +0000 UTC m=+162.042206409" Mar 18 16:46:50.017763 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:50.017725 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bv7gz" event={"ID":"f604a9b3-7355-4f54-8113-531b6f45d7f2","Type":"ContainerStarted","Data":"84b968ca50866d1edb188f17e3a3e9686e1393699e0b01a85fce0636e42ca22c"} Mar 18 16:46:50.018170 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:50.017770 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bv7gz" event={"ID":"f604a9b3-7355-4f54-8113-531b6f45d7f2","Type":"ContainerStarted","Data":"991879af94a892e80efe5fc1478d7a22c0c62707a2db5efa47da17597e3ed65d"} Mar 18 16:46:50.018170 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:50.017847 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bv7gz" Mar 18 16:46:50.035368 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:50.035324 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bv7gz" podStartSLOduration=129.664692941 podStartE2EDuration="2m11.035309032s" podCreationTimestamp="2026-03-18 16:44:39 +0000 UTC" firstStartedPulling="2026-03-18 16:46:47.849710666 +0000 UTC m=+161.861819473" lastFinishedPulling="2026-03-18 16:46:49.220326751 +0000 UTC m=+163.232435564" observedRunningTime="2026-03-18 16:46:50.034867603 +0000 UTC m=+164.046976430" watchObservedRunningTime="2026-03-18 16:46:50.035309032 +0000 UTC m=+164.047417848" Mar 18 16:46:57.588586 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:46:57.588480 2578 util.go:30] "No sandbox for pod can be found. 
Mar 18 16:47:00.023486 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:00.023451 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bv7gz"
Mar 18 16:47:01.257456 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.257402 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86d7f745b7-6hv58"]
Mar 18 16:47:01.280880 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.280855 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gn87n"]
Mar 18 16:47:01.285420 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.285403 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gn87n"
Mar 18 16:47:01.287381 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.287356 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jqp6n\""
Mar 18 16:47:01.287748 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.287731 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Mar 18 16:47:01.287820 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.287734 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Mar 18 16:47:01.297808 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.297782 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gn87n"]
Mar 18 16:47:01.301390 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.301367 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr"]
Mar 18 16:47:01.304690 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.304671 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.312944 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.312920 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr"] Mar 18 16:47:01.329103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.329072 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.329235 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.329120 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.329235 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.329199 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqvhz\" (UniqueName: \"kubernetes.io/projected/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-kube-api-access-cqvhz\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.329305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.329234 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-crio-socket\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.329305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.329249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-data-volume\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.430298 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/415e43ce-3750-4e1f-9f99-17d7cebca805-trusted-ca\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.430469 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6mwv\" (UniqueName: \"kubernetes.io/projected/415e43ce-3750-4e1f-9f99-17d7cebca805-kube-api-access-l6mwv\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.430469 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430363 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqvhz\" (UniqueName: \"kubernetes.io/projected/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-kube-api-access-cqvhz\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.430469 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/415e43ce-3750-4e1f-9f99-17d7cebca805-registry-certificates\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.430469 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-crio-socket\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.430469 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-data-volume\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.430469 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/415e43ce-3750-4e1f-9f99-17d7cebca805-bound-sa-token\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.430469 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430469 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.430807 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430484 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/415e43ce-3750-4e1f-9f99-17d7cebca805-installation-pull-secrets\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.430807 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430546 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-crio-socket\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.430807 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430706 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.430807 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/415e43ce-3750-4e1f-9f99-17d7cebca805-ca-trust-extracted\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.430973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430811 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-data-volume\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.430973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/415e43ce-3750-4e1f-9f99-17d7cebca805-image-registry-private-configuration\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.430973 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.430881 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/415e43ce-3750-4e1f-9f99-17d7cebca805-registry-tls\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.431699 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.431679 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.432897 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.432877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.439462 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.439441 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqvhz\" (UniqueName: \"kubernetes.io/projected/21b7e61b-bd0b-483d-a381-12ff0e42fe1f-kube-api-access-cqvhz\") pod \"insights-runtime-extractor-gn87n\" (UID: \"21b7e61b-bd0b-483d-a381-12ff0e42fe1f\") " pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.531339 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.531313 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/415e43ce-3750-4e1f-9f99-17d7cebca805-ca-trust-extracted\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.531502 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.531359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/415e43ce-3750-4e1f-9f99-17d7cebca805-image-registry-private-configuration\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.531502 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.531475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/415e43ce-3750-4e1f-9f99-17d7cebca805-registry-tls\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.531596 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.531546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/415e43ce-3750-4e1f-9f99-17d7cebca805-trusted-ca\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.531640 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.531605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6mwv\" (UniqueName: \"kubernetes.io/projected/415e43ce-3750-4e1f-9f99-17d7cebca805-kube-api-access-l6mwv\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.531675 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.531654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/415e43ce-3750-4e1f-9f99-17d7cebca805-registry-certificates\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.531721 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.531699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/415e43ce-3750-4e1f-9f99-17d7cebca805-bound-sa-token\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.531772 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.531730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/415e43ce-3750-4e1f-9f99-17d7cebca805-installation-pull-secrets\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.531822 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.531767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/415e43ce-3750-4e1f-9f99-17d7cebca805-ca-trust-extracted\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.532520 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.532483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/415e43ce-3750-4e1f-9f99-17d7cebca805-registry-certificates\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.532751 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.532721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/415e43ce-3750-4e1f-9f99-17d7cebca805-trusted-ca\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.534069 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.534049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/415e43ce-3750-4e1f-9f99-17d7cebca805-installation-pull-secrets\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.534285 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.534266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/415e43ce-3750-4e1f-9f99-17d7cebca805-registry-tls\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.534352 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.534303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/415e43ce-3750-4e1f-9f99-17d7cebca805-image-registry-private-configuration\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.548544 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.548497 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6mwv\" (UniqueName: \"kubernetes.io/projected/415e43ce-3750-4e1f-9f99-17d7cebca805-kube-api-access-l6mwv\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.548685 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.548644 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/415e43ce-3750-4e1f-9f99-17d7cebca805-bound-sa-token\") pod \"image-registry-6ddd9cbcbd-xgxcr\" (UID: \"415e43ce-3750-4e1f-9f99-17d7cebca805\") " pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.594974 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.594943 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gn87n" Mar 18 16:47:01.614070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.614038 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:01.728446 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.728388 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gn87n"] Mar 18 16:47:01.732753 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:47:01.732723 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21b7e61b_bd0b_483d_a381_12ff0e42fe1f.slice/crio-fb6cd37922d8c3daadc22cfb44c124d3bc2b7c46db082bcb0e54d934dbedce60 WatchSource:0}: Error finding container fb6cd37922d8c3daadc22cfb44c124d3bc2b7c46db082bcb0e54d934dbedce60: Status 404 returned error can't find the container with id fb6cd37922d8c3daadc22cfb44c124d3bc2b7c46db082bcb0e54d934dbedce60 Mar 18 16:47:01.749459 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:01.749430 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr"] Mar 18 16:47:01.752373 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:47:01.752345 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod415e43ce_3750_4e1f_9f99_17d7cebca805.slice/crio-b694afd31d6f593919b62510d08f4e1e857a4528455ec895c8ff669b2dad4485 WatchSource:0}: Error finding container b694afd31d6f593919b62510d08f4e1e857a4528455ec895c8ff669b2dad4485: Status 404 returned error can't find the container with id b694afd31d6f593919b62510d08f4e1e857a4528455ec895c8ff669b2dad4485 Mar 18 16:47:02.053980 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:02.053880 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gn87n" event={"ID":"21b7e61b-bd0b-483d-a381-12ff0e42fe1f","Type":"ContainerStarted","Data":"1e95112c8993085c39e94061d28e37687c3d0a34870171e670fb5fa76313e2b8"} Mar 18 16:47:02.053980 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:02.053926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gn87n" event={"ID":"21b7e61b-bd0b-483d-a381-12ff0e42fe1f","Type":"ContainerStarted","Data":"fb6cd37922d8c3daadc22cfb44c124d3bc2b7c46db082bcb0e54d934dbedce60"} Mar 18 16:47:02.055210 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:02.055187 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" event={"ID":"415e43ce-3750-4e1f-9f99-17d7cebca805","Type":"ContainerStarted","Data":"f7b2fbefd3ee6a9249de9b33eb881d839e2b073ebd49452db03e28bf987647df"} Mar 18 16:47:02.055210 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:02.055213 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" event={"ID":"415e43ce-3750-4e1f-9f99-17d7cebca805","Type":"ContainerStarted","Data":"b694afd31d6f593919b62510d08f4e1e857a4528455ec895c8ff669b2dad4485"} Mar 18 16:47:02.055455 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:02.055432 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:02.074253 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:02.074193 2578 
Mar 18 16:47:03.059533 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:03.059499 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gn87n" event={"ID":"21b7e61b-bd0b-483d-a381-12ff0e42fe1f","Type":"ContainerStarted","Data":"c1f104c89c6d787110f172d54a4a9b22133831fe185aaf6250ed4b2d80d4a103"}
Mar 18 16:47:04.063926 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:04.063892 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gn87n" event={"ID":"21b7e61b-bd0b-483d-a381-12ff0e42fe1f","Type":"ContainerStarted","Data":"09da38f3b942b180f9ff344cadabbf67fa51828942f5c1b4b06d8c76c2416196"}
Mar 18 16:47:04.080845 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:04.080787 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gn87n" podStartSLOduration=0.925320595 podStartE2EDuration="3.080767691s" podCreationTimestamp="2026-03-18 16:47:01 +0000 UTC" firstStartedPulling="2026-03-18 16:47:01.804131841 +0000 UTC m=+175.816240652" lastFinishedPulling="2026-03-18 16:47:03.959578928 +0000 UTC m=+177.971687748" observedRunningTime="2026-03-18 16:47:04.080325863 +0000 UTC m=+178.092434688" watchObservedRunningTime="2026-03-18 16:47:04.080767691 +0000 UTC m=+178.092876513"
Mar 18 16:47:11.262299 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:11.262263 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58"
Mar 18 16:47:15.779738 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.779705 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hnxlk"]
Mar 18 16:47:15.784385 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.784366 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:15.786587 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.786562 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Mar 18 16:47:15.786777 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.786644 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Mar 18 16:47:15.786898 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.786880 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Mar 18 16:47:15.787053 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.787033 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qqtwn\""
Mar 18 16:47:15.787682 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.787553 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Mar 18 16:47:15.787682 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.787565 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Mar 18 16:47:15.787682 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.787640 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Mar 18 16:47:15.841803 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.841757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-wtmp\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:15.841986 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.841812 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-tls\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:15.841986 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.841840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-sys\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:15.841986 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.841918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrd42\" (UniqueName: \"kubernetes.io/projected/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-kube-api-access-rrd42\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:15.841986 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.841946 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-textfile\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
\"kubernetes.io/empty-dir/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-textfile\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.842132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.842014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-metrics-client-ca\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.842132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.842100 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-accelerators-collector-config\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.842203 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.842166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.842203 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.842197 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-root\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.942630 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.942589 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-metrics-client-ca\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.942823 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.942657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-accelerators-collector-config\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.942823 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.942710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.942823 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.942737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-root\") pod 
\"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.942823 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.942768 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-wtmp\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.942823 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.942793 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-tls\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.942823 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.942820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-sys\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.943132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.942856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrd42\" (UniqueName: \"kubernetes.io/projected/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-kube-api-access-rrd42\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.943132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.942883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-textfile\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.943132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.942935 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-wtmp\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.943132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.943004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-root\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.943132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.943057 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-sys\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk" Mar 18 16:47:15.943132 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:47:15.943108 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Mar 18 16:47:15.943425 ip-10-0-143-175 kubenswrapper[2578]: E0318 
Mar 18 16:47:15.943425 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.943259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-metrics-client-ca\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:15.943425 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.943329 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-accelerators-collector-config\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:15.943625 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.943466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-textfile\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:15.945247 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.945223 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:15.953342 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:15.953311 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrd42\" (UniqueName: \"kubernetes.io/projected/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-kube-api-access-rrd42\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:16.447910 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:16.447860 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-tls\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:16.450122 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:16.450093 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d1cbe17-16d8-49c6-9433-bee7ad721a0d-node-exporter-tls\") pod \"node-exporter-hnxlk\" (UID: \"6d1cbe17-16d8-49c6-9433-bee7ad721a0d\") " pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:16.694141 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:16.694104 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hnxlk"
Mar 18 16:47:16.701989 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:47:16.701954 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d1cbe17_16d8_49c6_9433_bee7ad721a0d.slice/crio-464361f5fdfdfa80e0ea318e723829802d6df8b079b31dc0c7f66cf5ac3c488f WatchSource:0}: Error finding container 464361f5fdfdfa80e0ea318e723829802d6df8b079b31dc0c7f66cf5ac3c488f: Status 404 returned error can't find the container with id 464361f5fdfdfa80e0ea318e723829802d6df8b079b31dc0c7f66cf5ac3c488f
Mar 18 16:47:17.095861 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:17.095827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hnxlk" event={"ID":"6d1cbe17-16d8-49c6-9433-bee7ad721a0d","Type":"ContainerStarted","Data":"464361f5fdfdfa80e0ea318e723829802d6df8b079b31dc0c7f66cf5ac3c488f"}
Mar 18 16:47:18.099799 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:18.099719 2578 generic.go:358] "Generic (PLEG): container finished" podID="6d1cbe17-16d8-49c6-9433-bee7ad721a0d" containerID="5c68bcc6c834ef7b8e2578f70889ee36e869e390456cf84d95e38521c9d668ce" exitCode=0
Mar 18 16:47:18.099799 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:18.099782 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hnxlk" event={"ID":"6d1cbe17-16d8-49c6-9433-bee7ad721a0d","Type":"ContainerDied","Data":"5c68bcc6c834ef7b8e2578f70889ee36e869e390456cf84d95e38521c9d668ce"}
Mar 18 16:47:19.105754 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:19.105721 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hnxlk" event={"ID":"6d1cbe17-16d8-49c6-9433-bee7ad721a0d","Type":"ContainerStarted","Data":"41a42ed7baebadce9b928973815994bbacc1714cfa75a90735fda096b50fba02"}
Mar 18 16:47:19.105754 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:19.105757 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hnxlk" event={"ID":"6d1cbe17-16d8-49c6-9433-bee7ad721a0d","Type":"ContainerStarted","Data":"18e6c2d427da2a204f49fc2685415a98459f9fd672aada41228ed7e2b8ac906d"}
Mar 18 16:47:19.144053 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:19.144005 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hnxlk" podStartSLOduration=3.050423293 podStartE2EDuration="4.143988407s" podCreationTimestamp="2026-03-18 16:47:15 +0000 UTC" firstStartedPulling="2026-03-18 16:47:16.703572441 +0000 UTC m=+190.715681247" lastFinishedPulling="2026-03-18 16:47:17.797137554 +0000 UTC m=+191.809246361" observedRunningTime="2026-03-18 16:47:19.142402898 +0000 UTC m=+193.154511724" watchObservedRunningTime="2026-03-18 16:47:19.143988407 +0000 UTC m=+193.156097231"
Mar 18 16:47:20.248686 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.248655 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-57f6d8dc57-4tc82"]
Mar 18 16:47:20.251836 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.251819 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82"
Need to start a new one" pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.266349 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.266317 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Mar 18 16:47:20.266480 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.266423 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Mar 18 16:47:20.266480 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.266431 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Mar 18 16:47:20.266480 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.266448 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Mar 18 16:47:20.266480 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.266470 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-27qzn\"" Mar 18 16:47:20.266684 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.266424 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-nps2ecr26p1j\"" Mar 18 16:47:20.295886 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.295853 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57f6d8dc57-4tc82"] Mar 18 16:47:20.382140 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.382110 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4d7aa466-f884-4bf5-b245-b7e3250058d9-secret-metrics-server-client-certs\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.382140 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.382143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5qc\" (UniqueName: \"kubernetes.io/projected/4d7aa466-f884-4bf5-b245-b7e3250058d9-kube-api-access-7j5qc\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.382335 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.382210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7aa466-f884-4bf5-b245-b7e3250058d9-client-ca-bundle\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.382335 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.382252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d7aa466-f884-4bf5-b245-b7e3250058d9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.382335 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.382277 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4d7aa466-f884-4bf5-b245-b7e3250058d9-secret-metrics-server-tls\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.382434 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.382367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4d7aa466-f884-4bf5-b245-b7e3250058d9-audit-log\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.382434 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.382391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4d7aa466-f884-4bf5-b245-b7e3250058d9-metrics-server-audit-profiles\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.482829 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.482787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4d7aa466-f884-4bf5-b245-b7e3250058d9-secret-metrics-server-client-certs\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.482829 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.482830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5qc\" (UniqueName: \"kubernetes.io/projected/4d7aa466-f884-4bf5-b245-b7e3250058d9-kube-api-access-7j5qc\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.483086 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.482917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7aa466-f884-4bf5-b245-b7e3250058d9-client-ca-bundle\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.483086 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.483031 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d7aa466-f884-4bf5-b245-b7e3250058d9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.483086 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.483059 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4d7aa466-f884-4bf5-b245-b7e3250058d9-secret-metrics-server-tls\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:20.483262 
Mar 18 16:47:20.483262 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.483191 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4d7aa466-f884-4bf5-b245-b7e3250058d9-metrics-server-audit-profiles\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82"
Mar 18 16:47:20.483645 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.483614 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4d7aa466-f884-4bf5-b245-b7e3250058d9-audit-log\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82"
Mar 18 16:47:20.484046 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.484018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d7aa466-f884-4bf5-b245-b7e3250058d9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82"
Mar 18 16:47:20.484177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.484160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4d7aa466-f884-4bf5-b245-b7e3250058d9-metrics-server-audit-profiles\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82"
Mar 18 16:47:20.485487 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.485466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4d7aa466-f884-4bf5-b245-b7e3250058d9-secret-metrics-server-tls\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82"
Mar 18 16:47:20.485630 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.485614 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4d7aa466-f884-4bf5-b245-b7e3250058d9-secret-metrics-server-client-certs\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82"
Mar 18 16:47:20.486210 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.486189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d7aa466-f884-4bf5-b245-b7e3250058d9-client-ca-bundle\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82"
Mar 18 16:47:20.513124 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.513065 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5qc\" (UniqueName: \"kubernetes.io/projected/4d7aa466-f884-4bf5-b245-b7e3250058d9-kube-api-access-7j5qc\") pod \"metrics-server-57f6d8dc57-4tc82\" (UID: \"4d7aa466-f884-4bf5-b245-b7e3250058d9\") " pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82"
Mar 18 16:47:20.561149 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.561109 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82"
Mar 18 16:47:20.713411 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.713380 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57f6d8dc57-4tc82"]
Mar 18 16:47:20.718130 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:47:20.718094 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d7aa466_f884_4bf5_b245_b7e3250058d9.slice/crio-8920637f585dcc301f2f30203742b0784536d1db74672d72bb9f1a3cc6224f34 WatchSource:0}: Error finding container 8920637f585dcc301f2f30203742b0784536d1db74672d72bb9f1a3cc6224f34: Status 404 returned error can't find the container with id 8920637f585dcc301f2f30203742b0784536d1db74672d72bb9f1a3cc6224f34
Mar 18 16:47:20.922700 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.922665 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94"]
Mar 18 16:47:20.927093 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.927072 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94"
Mar 18 16:47:20.929561 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.929538 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Mar 18 16:47:20.929561 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.929553 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-5vnm2\""
Mar 18 16:47:20.957082 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:20.957053 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94"]
Mar 18 16:47:21.089289 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:21.089256 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/127a2232-0ea6-440f-ad16-58a6181cf5c7-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-7mt94\" (UID: \"127a2232-0ea6-440f-ad16-58a6181cf5c7\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94"
Mar 18 16:47:21.111356 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:21.111326 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" event={"ID":"4d7aa466-f884-4bf5-b245-b7e3250058d9","Type":"ContainerStarted","Data":"8920637f585dcc301f2f30203742b0784536d1db74672d72bb9f1a3cc6224f34"}
Mar 18 16:47:21.190031 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:21.189940 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/127a2232-0ea6-440f-ad16-58a6181cf5c7-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-7mt94\" (UID: \"127a2232-0ea6-440f-ad16-58a6181cf5c7\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94"
pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94" Mar 18 16:47:21.192286 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:21.192266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/127a2232-0ea6-440f-ad16-58a6181cf5c7-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-7mt94\" (UID: \"127a2232-0ea6-440f-ad16-58a6181cf5c7\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94" Mar 18 16:47:21.236339 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:21.236304 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94" Mar 18 16:47:21.393380 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:21.393346 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94"] Mar 18 16:47:21.397405 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:47:21.397371 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod127a2232_0ea6_440f_ad16_58a6181cf5c7.slice/crio-523825c7e04e9f70b15aec36cf51fa99b673eaf769ae97cb179a9ae93d509bf3 WatchSource:0}: Error finding container 523825c7e04e9f70b15aec36cf51fa99b673eaf769ae97cb179a9ae93d509bf3: Status 404 returned error can't find the container with id 523825c7e04e9f70b15aec36cf51fa99b673eaf769ae97cb179a9ae93d509bf3 Mar 18 16:47:22.117499 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:22.117475 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94" event={"ID":"127a2232-0ea6-440f-ad16-58a6181cf5c7","Type":"ContainerStarted","Data":"523825c7e04e9f70b15aec36cf51fa99b673eaf769ae97cb179a9ae93d509bf3"} Mar 18 16:47:23.063803 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:23.063775 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6ddd9cbcbd-xgxcr" Mar 18 16:47:23.121516 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:23.121488 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" event={"ID":"4d7aa466-f884-4bf5-b245-b7e3250058d9","Type":"ContainerStarted","Data":"3851489c63beb3a18372732a1f4f60d39770479bb238ec2628a65fd072e6a228"} Mar 18 16:47:23.193828 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:23.193769 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" podStartSLOduration=1.8301745 podStartE2EDuration="3.193746924s" podCreationTimestamp="2026-03-18 16:47:20 +0000 UTC" firstStartedPulling="2026-03-18 16:47:20.720479077 +0000 UTC m=+194.732587883" lastFinishedPulling="2026-03-18 16:47:22.084051502 +0000 UTC m=+196.096160307" observedRunningTime="2026-03-18 16:47:23.190616401 +0000 UTC m=+197.202725227" watchObservedRunningTime="2026-03-18 16:47:23.193746924 +0000 UTC m=+197.205855749" Mar 18 16:47:24.128046 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:24.128012 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94" event={"ID":"127a2232-0ea6-440f-ad16-58a6181cf5c7","Type":"ContainerStarted","Data":"7b4f3e993fe8e1629de3706f9ee943cc0e6aca51a0ebcf3a7ebf5cf9f99f5c78"} Mar 18 16:47:24.128461 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:24.128297 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94" Mar 18 16:47:24.133135 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:24.133113 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94" Mar 18 16:47:24.150311 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:24.150259 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-7mt94" podStartSLOduration=2.439954353 podStartE2EDuration="4.15024212s" podCreationTimestamp="2026-03-18 16:47:20 +0000 UTC" firstStartedPulling="2026-03-18 16:47:21.399492621 +0000 UTC m=+195.411601430" lastFinishedPulling="2026-03-18 16:47:23.109780394 +0000 UTC m=+197.121889197" observedRunningTime="2026-03-18 16:47:24.148285021 +0000 UTC m=+198.160393846" watchObservedRunningTime="2026-03-18 16:47:24.15024212 +0000 UTC m=+198.162350945" Mar 18 16:47:26.276323 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.276245 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" podUID="73d3b30d-1d2b-4666-8183-1a58e717bac7" containerName="registry" containerID="cri-o://d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236" gracePeriod=30 Mar 18 16:47:26.514697 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.514673 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:47:26.634076 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.634019 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-installation-pull-secrets\") pod \"73d3b30d-1d2b-4666-8183-1a58e717bac7\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " Mar 18 16:47:26.634076 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.634080 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-image-registry-private-configuration\") pod \"73d3b30d-1d2b-4666-8183-1a58e717bac7\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " Mar 18 16:47:26.634347 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.634104 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-trusted-ca\") pod \"73d3b30d-1d2b-4666-8183-1a58e717bac7\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " Mar 18 16:47:26.634347 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.634137 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5pvc\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-kube-api-access-p5pvc\") pod \"73d3b30d-1d2b-4666-8183-1a58e717bac7\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " Mar 18 16:47:26.634347 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.634165 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-certificates\") pod \"73d3b30d-1d2b-4666-8183-1a58e717bac7\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") " Mar 18 16:47:26.634347 ip-10-0-143-175 
Mar 18 16:47:26.634347 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.634340 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-bound-sa-token\") pod \"73d3b30d-1d2b-4666-8183-1a58e717bac7\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") "
Mar 18 16:47:26.634631 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.634375 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") pod \"73d3b30d-1d2b-4666-8183-1a58e717bac7\" (UID: \"73d3b30d-1d2b-4666-8183-1a58e717bac7\") "
Mar 18 16:47:26.634690 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.634674 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "73d3b30d-1d2b-4666-8183-1a58e717bac7" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:47:26.634746 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.634684 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "73d3b30d-1d2b-4666-8183-1a58e717bac7" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:47:26.636885 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.636844 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "73d3b30d-1d2b-4666-8183-1a58e717bac7" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:47:26.637045 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.636920 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-kube-api-access-p5pvc" (OuterVolumeSpecName: "kube-api-access-p5pvc") pod "73d3b30d-1d2b-4666-8183-1a58e717bac7" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7"). InnerVolumeSpecName "kube-api-access-p5pvc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:47:26.637045 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.636975 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "73d3b30d-1d2b-4666-8183-1a58e717bac7" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:47:26.637045 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.637001 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "73d3b30d-1d2b-4666-8183-1a58e717bac7" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:47:26.637220 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.637054 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "73d3b30d-1d2b-4666-8183-1a58e717bac7" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:47:26.643708 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.643682 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d3b30d-1d2b-4666-8183-1a58e717bac7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "73d3b30d-1d2b-4666-8183-1a58e717bac7" (UID: "73d3b30d-1d2b-4666-8183-1a58e717bac7"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:47:26.735839 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.735802 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-bound-sa-token\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:47:26.735839 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.735832 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:47:26.735839 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.735842 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-installation-pull-secrets\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:47:26.736070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.735853 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73d3b30d-1d2b-4666-8183-1a58e717bac7-image-registry-private-configuration\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:47:26.736070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.735863 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-trusted-ca\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:47:26.736070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.735872 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p5pvc\" (UniqueName: \"kubernetes.io/projected/73d3b30d-1d2b-4666-8183-1a58e717bac7-kube-api-access-p5pvc\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:47:26.736070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.735880 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-certificates\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
(UniqueName: \"kubernetes.io/configmap/73d3b30d-1d2b-4666-8183-1a58e717bac7-registry-certificates\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:47:26.736070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:26.735888 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73d3b30d-1d2b-4666-8183-1a58e717bac7-ca-trust-extracted\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:47:27.136513 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:27.136475 2578 generic.go:358] "Generic (PLEG): container finished" podID="73d3b30d-1d2b-4666-8183-1a58e717bac7" containerID="d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236" exitCode=0 Mar 18 16:47:27.136700 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:27.136553 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" Mar 18 16:47:27.136700 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:27.136566 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" event={"ID":"73d3b30d-1d2b-4666-8183-1a58e717bac7","Type":"ContainerDied","Data":"d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236"} Mar 18 16:47:27.136700 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:27.136610 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86d7f745b7-6hv58" event={"ID":"73d3b30d-1d2b-4666-8183-1a58e717bac7","Type":"ContainerDied","Data":"dc43ee6408b863137d0157aed3e95e5aac15b5fee1c3753e732aeabfcc722e07"} Mar 18 16:47:27.136700 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:27.136626 2578 scope.go:117] "RemoveContainer" containerID="d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236" Mar 18 16:47:27.144598 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:27.144577 2578 scope.go:117] "RemoveContainer" containerID="d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236" Mar 18 16:47:27.144885 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:47:27.144858 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236\": container with ID starting with d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236 not found: ID does not exist" containerID="d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236" Mar 18 16:47:27.144972 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:27.144892 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236"} err="failed to get container status \"d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236\": rpc error: code = NotFound desc = could not find container \"d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236\": container with ID starting with d5278eabe108076f3f93eef7d93d1da92b3951ac8cd1a67552b7539859ee9236 not found: ID does not exist" Mar 18 16:47:27.160079 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:27.160055 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86d7f745b7-6hv58"] Mar 18 16:47:27.182512 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:27.182486 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-image-registry/image-registry-86d7f745b7-6hv58"] Mar 18 16:47:28.592297 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:28.592264 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d3b30d-1d2b-4666-8183-1a58e717bac7" path="/var/lib/kubelet/pods/73d3b30d-1d2b-4666-8183-1a58e717bac7/volumes" Mar 18 16:47:35.781847 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.781815 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5678f46c46-rrrz9"] Mar 18 16:47:35.782321 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.782159 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73d3b30d-1d2b-4666-8183-1a58e717bac7" containerName="registry" Mar 18 16:47:35.782321 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.782177 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d3b30d-1d2b-4666-8183-1a58e717bac7" containerName="registry" Mar 18 16:47:35.782321 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.782253 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="73d3b30d-1d2b-4666-8183-1a58e717bac7" containerName="registry" Mar 18 16:47:35.786842 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.786820 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:35.789704 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.789678 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 18 16:47:35.789925 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.789727 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Mar 18 16:47:35.789925 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.789682 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 18 16:47:35.789925 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.789682 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 18 16:47:35.789925 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.789688 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pbg6k\"" Mar 18 16:47:35.789925 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.789687 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Mar 18 16:47:35.789925 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.789737 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 18 16:47:35.790290 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.790089 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 18 16:47:35.796479 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.796459 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 18 16:47:35.797008 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.796990 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5678f46c46-rrrz9"] Mar 18 16:47:35.907245 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.907211 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-service-ca\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:35.907434 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.907260 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-trusted-ca-bundle\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:35.907434 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.907336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjmj\" (UniqueName: \"kubernetes.io/projected/cc0101a5-e04a-438f-9eb7-ee76683894a0-kube-api-access-qqjmj\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:35.907434 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.907366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-serving-cert\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:35.907434 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.907397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-config\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:35.907434 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.907421 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-oauth-serving-cert\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:35.907639 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:35.907440 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-oauth-config\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.008565 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.008513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-service-ca\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.008770 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.008573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-trusted-ca-bundle\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.008770 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.008624 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjmj\" (UniqueName: \"kubernetes.io/projected/cc0101a5-e04a-438f-9eb7-ee76683894a0-kube-api-access-qqjmj\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.008770 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.008646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-serving-cert\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.008770 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.008694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-config\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.008770 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.008726 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-oauth-serving-cert\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.008770 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.008762 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-oauth-config\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.009432 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.009401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-config\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.009598 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.009470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-oauth-serving-cert\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.009598 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.009575 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-service-ca\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.009717 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:47:36.009678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-trusted-ca-bundle\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.010978 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.010955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-serving-cert\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.011108 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.011090 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-oauth-config\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.018776 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.018754 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjmj\" (UniqueName: \"kubernetes.io/projected/cc0101a5-e04a-438f-9eb7-ee76683894a0-kube-api-access-qqjmj\") pod \"console-5678f46c46-rrrz9\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") " pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.099038 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.098933 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:36.227193 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:36.227154 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5678f46c46-rrrz9"] Mar 18 16:47:36.230811 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:47:36.230785 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc0101a5_e04a_438f_9eb7_ee76683894a0.slice/crio-8a77585d137144f244beaa471fde1a5b8c887c4c675116cbf31f24bdc49a58ae WatchSource:0}: Error finding container 8a77585d137144f244beaa471fde1a5b8c887c4c675116cbf31f24bdc49a58ae: Status 404 returned error can't find the container with id 8a77585d137144f244beaa471fde1a5b8c887c4c675116cbf31f24bdc49a58ae Mar 18 16:47:37.167514 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:37.167471 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5678f46c46-rrrz9" event={"ID":"cc0101a5-e04a-438f-9eb7-ee76683894a0","Type":"ContainerStarted","Data":"8a77585d137144f244beaa471fde1a5b8c887c4c675116cbf31f24bdc49a58ae"} Mar 18 16:47:39.174068 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:39.173972 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5678f46c46-rrrz9" event={"ID":"cc0101a5-e04a-438f-9eb7-ee76683894a0","Type":"ContainerStarted","Data":"1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7"} Mar 18 16:47:39.191517 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:39.191471 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5678f46c46-rrrz9" podStartSLOduration=1.586660364 podStartE2EDuration="4.191448104s" podCreationTimestamp="2026-03-18 16:47:35 +0000 UTC" 
firstStartedPulling="2026-03-18 16:47:36.232472954 +0000 UTC m=+210.244581758" lastFinishedPulling="2026-03-18 16:47:38.837260691 +0000 UTC m=+212.849369498" observedRunningTime="2026-03-18 16:47:39.190769151 +0000 UTC m=+213.202878001" watchObservedRunningTime="2026-03-18 16:47:39.191448104 +0000 UTC m=+213.203556925" Mar 18 16:47:40.562028 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:40.561998 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:40.562389 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:40.562084 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:47:46.099074 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:46.099025 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:46.099074 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:46.099071 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:46.103868 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:46.103845 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:46.195478 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:46.195450 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5678f46c46-rrrz9" Mar 18 16:47:47.195666 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:47.195630 2578 generic.go:358] "Generic (PLEG): container finished" podID="61e0afa5-670e-42e7-8743-980838b63847" containerID="288332924e1de03a5bc1cf8b18e3caba9648b6f1116374355537fc7d9b0b6ec7" exitCode=0 Mar 18 16:47:47.196007 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:47.195701 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-x922g" event={"ID":"61e0afa5-670e-42e7-8743-980838b63847","Type":"ContainerDied","Data":"288332924e1de03a5bc1cf8b18e3caba9648b6f1116374355537fc7d9b0b6ec7"} Mar 18 16:47:47.196183 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:47.196167 2578 scope.go:117] "RemoveContainer" containerID="288332924e1de03a5bc1cf8b18e3caba9648b6f1116374355537fc7d9b0b6ec7" Mar 18 16:47:48.201125 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:47:48.201092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-x922g" event={"ID":"61e0afa5-670e-42e7-8743-980838b63847","Type":"ContainerStarted","Data":"8b7507cadad06ae955eeb7fae0c36a01d7c271e284c95325b37efbcc11364364"} Mar 18 16:48:00.567871 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:00.567823 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:48:00.571883 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:00.571855 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-57f6d8dc57-4tc82" Mar 18 16:48:18.473165 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:18.473129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " 
pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:48:18.475371 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:18.475349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bd683e7-9eae-4e40-ba74-10c5acb6fd8b-metrics-certs\") pod \"network-metrics-daemon-bcgtq\" (UID: \"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b\") " pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:48:18.591996 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:18.591969 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrjl4\"" Mar 18 16:48:18.600737 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:18.600715 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bcgtq" Mar 18 16:48:18.714996 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:18.714964 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bcgtq"] Mar 18 16:48:18.718278 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:48:18.718246 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bd683e7_9eae_4e40_ba74_10c5acb6fd8b.slice/crio-6d21eab9fbbdc74c6c369369d9f320120a8ed3cb030cf1ecded98a2aa5866072 WatchSource:0}: Error finding container 6d21eab9fbbdc74c6c369369d9f320120a8ed3cb030cf1ecded98a2aa5866072: Status 404 returned error can't find the container with id 6d21eab9fbbdc74c6c369369d9f320120a8ed3cb030cf1ecded98a2aa5866072 Mar 18 16:48:19.284990 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:19.284954 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcgtq" event={"ID":"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b","Type":"ContainerStarted","Data":"6d21eab9fbbdc74c6c369369d9f320120a8ed3cb030cf1ecded98a2aa5866072"} Mar 18 16:48:20.289191 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:20.289154 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcgtq" event={"ID":"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b","Type":"ContainerStarted","Data":"122efe275aeae082705f496b8ec7cb4ade50d6ba830780f5b850f685a0112789"} Mar 18 16:48:20.289191 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:20.289192 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bcgtq" event={"ID":"4bd683e7-9eae-4e40-ba74-10c5acb6fd8b","Type":"ContainerStarted","Data":"256385e4f47004c08615b5f9b50b396e91f3f3ca7ad8f20c9cee19edf1b5a044"} Mar 18 16:48:20.305256 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:20.305211 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bcgtq" podStartSLOduration=253.337873292 podStartE2EDuration="4m14.305157149s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:48:18.720023443 +0000 UTC m=+252.732132247" lastFinishedPulling="2026-03-18 16:48:19.687307286 +0000 UTC m=+253.699416104" observedRunningTime="2026-03-18 16:48:20.304409108 +0000 UTC m=+254.316517933" watchObservedRunningTime="2026-03-18 16:48:20.305157149 +0000 UTC m=+254.317265954" Mar 18 16:48:40.346779 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.346702 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-74554c6fc7-vjhfx"] Mar 18 16:48:40.349850 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:48:40.349833 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.352297 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.352266 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Mar 18 16:48:40.352823 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.352801 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Mar 18 16:48:40.353070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.352918 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-sr7nh\"" Mar 18 16:48:40.353162 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.353105 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Mar 18 16:48:40.353261 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.353237 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Mar 18 16:48:40.357721 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.357699 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Mar 18 16:48:40.360030 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.360008 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Mar 18 16:48:40.361069 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.361040 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-74554c6fc7-vjhfx"] Mar 18 16:48:40.452420 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.452389 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-telemeter-client-tls\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.452420 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.452422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73448893-1fc0-402a-9e38-cf3a81a0a9f0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.452673 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.452511 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73448893-1fc0-402a-9e38-cf3a81a0a9f0-serving-certs-ca-bundle\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.452673 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.452571 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-secret-telemeter-client\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.452673 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.452617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.452673 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.452645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73448893-1fc0-402a-9e38-cf3a81a0a9f0-metrics-client-ca\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.452799 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.452676 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-federate-client-tls\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.452799 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.452694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9stj\" (UniqueName: \"kubernetes.io/projected/73448893-1fc0-402a-9e38-cf3a81a0a9f0-kube-api-access-t9stj\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.553882 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.553833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-secret-telemeter-client\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.553882 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.553885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.554073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.553906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73448893-1fc0-402a-9e38-cf3a81a0a9f0-metrics-client-ca\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.554073 ip-10-0-143-175 kubenswrapper[2578]: I0318 
16:48:40.553924 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-federate-client-tls\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.554073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.553941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9stj\" (UniqueName: \"kubernetes.io/projected/73448893-1fc0-402a-9e38-cf3a81a0a9f0-kube-api-access-t9stj\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.554073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.553984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-telemeter-client-tls\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.554073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.554004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73448893-1fc0-402a-9e38-cf3a81a0a9f0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.554073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.554073 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73448893-1fc0-402a-9e38-cf3a81a0a9f0-serving-certs-ca-bundle\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.554774 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.554749 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73448893-1fc0-402a-9e38-cf3a81a0a9f0-serving-certs-ca-bundle\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.554907 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.554802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73448893-1fc0-402a-9e38-cf3a81a0a9f0-metrics-client-ca\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.555023 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.554998 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73448893-1fc0-402a-9e38-cf3a81a0a9f0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.556611 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.556583 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-telemeter-client-tls\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.556737 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.556721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-secret-telemeter-client\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.556813 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.556795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-federate-client-tls\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.556813 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.556807 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/73448893-1fc0-402a-9e38-cf3a81a0a9f0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.561559 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.561521 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9stj\" (UniqueName: \"kubernetes.io/projected/73448893-1fc0-402a-9e38-cf3a81a0a9f0-kube-api-access-t9stj\") pod \"telemeter-client-74554c6fc7-vjhfx\" (UID: \"73448893-1fc0-402a-9e38-cf3a81a0a9f0\") " pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.662126 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.662049 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" Mar 18 16:48:40.787260 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:40.787216 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-74554c6fc7-vjhfx"] Mar 18 16:48:40.790290 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:48:40.790252 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73448893_1fc0_402a_9e38_cf3a81a0a9f0.slice/crio-3d51e754c9f4d123620c16ec437e075c69eeaeda5af06547872a8c9632651711 WatchSource:0}: Error finding container 3d51e754c9f4d123620c16ec437e075c69eeaeda5af06547872a8c9632651711: Status 404 returned error can't find the container with id 3d51e754c9f4d123620c16ec437e075c69eeaeda5af06547872a8c9632651711 Mar 18 16:48:41.346860 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:41.346825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" event={"ID":"73448893-1fc0-402a-9e38-cf3a81a0a9f0","Type":"ContainerStarted","Data":"3d51e754c9f4d123620c16ec437e075c69eeaeda5af06547872a8c9632651711"} Mar 18 16:48:43.354465 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:43.354424 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" event={"ID":"73448893-1fc0-402a-9e38-cf3a81a0a9f0","Type":"ContainerStarted","Data":"f286762d7cc82a7d1e6edc304b2c78c1315833f1e8f9313d70e86c87ea04cc8f"} Mar 18 16:48:44.360352 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:44.360314 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" event={"ID":"73448893-1fc0-402a-9e38-cf3a81a0a9f0","Type":"ContainerStarted","Data":"a4c92423a76d37a1a57e4993791552c243e95151c7ae6d0bbc3c3f691dc3f725"} Mar 18 16:48:44.360352 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:44.360358 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" event={"ID":"73448893-1fc0-402a-9e38-cf3a81a0a9f0","Type":"ContainerStarted","Data":"6d4351d22cf31b3ff3afccee49eb423b0ace220f68e586669e78b94931c6e1d1"} Mar 18 16:48:44.396447 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:44.396389 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-74554c6fc7-vjhfx" podStartSLOduration=1.521314452 podStartE2EDuration="4.39637178s" podCreationTimestamp="2026-03-18 16:48:40 +0000 UTC" firstStartedPulling="2026-03-18 16:48:40.79218209 +0000 UTC m=+274.804290893" lastFinishedPulling="2026-03-18 16:48:43.667239398 +0000 UTC m=+277.679348221" observedRunningTime="2026-03-18 16:48:44.395312297 +0000 UTC m=+278.407421158" watchObservedRunningTime="2026-03-18 16:48:44.39637178 +0000 UTC m=+278.408480621" Mar 18 16:48:45.041039 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.041010 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f68cd56dd-5tpxq"] Mar 18 16:48:45.044161 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.044145 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.053511 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.053488 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f68cd56dd-5tpxq"] Mar 18 16:48:45.094221 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.094194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkwk6\" (UniqueName: \"kubernetes.io/projected/8762406a-4b01-432f-ba53-96f3b7751e2a-kube-api-access-nkwk6\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.094383 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.094229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-service-ca\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.094383 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.094253 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-serving-cert\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.094383 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.094269 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-console-config\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.094383 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.094343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-oauth-serving-cert\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.094512 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.094383 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-trusted-ca-bundle\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.094512 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.094407 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-oauth-config\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.195144 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.195110 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-oauth-config\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.195307 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.195164 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkwk6\" (UniqueName: \"kubernetes.io/projected/8762406a-4b01-432f-ba53-96f3b7751e2a-kube-api-access-nkwk6\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.195307 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.195188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-service-ca\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.195307 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.195211 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-serving-cert\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.195307 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.195227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-console-config\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.195498 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.195318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-oauth-serving-cert\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.195498 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.195488 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-trusted-ca-bundle\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.195935 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.195915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-console-config\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.196046 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.196014 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-service-ca\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.196146 ip-10-0-143-175 kubenswrapper[2578]: I0318 
16:48:45.196123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-oauth-serving-cert\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.196413 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.196392 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-trusted-ca-bundle\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.197679 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.197659 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-oauth-config\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.198202 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.198175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-serving-cert\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.203655 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.203624 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkwk6\" (UniqueName: \"kubernetes.io/projected/8762406a-4b01-432f-ba53-96f3b7751e2a-kube-api-access-nkwk6\") pod \"console-5f68cd56dd-5tpxq\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.353739 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.353657 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:45.476202 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:45.476150 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f68cd56dd-5tpxq"] Mar 18 16:48:45.480044 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:48:45.480016 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8762406a_4b01_432f_ba53_96f3b7751e2a.slice/crio-cf871e1bcd9ea7fc8a7dc5315383e88daa8f5a2f36160c09aa6c2cea6a028e84 WatchSource:0}: Error finding container cf871e1bcd9ea7fc8a7dc5315383e88daa8f5a2f36160c09aa6c2cea6a028e84: Status 404 returned error can't find the container with id cf871e1bcd9ea7fc8a7dc5315383e88daa8f5a2f36160c09aa6c2cea6a028e84 Mar 18 16:48:45.988246 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:48:45.988203 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gwqfh" podUID="3d42738c-ceaa-4925-8255-a2b61010e00f" Mar 18 16:48:46.368089 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:46.368052 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f68cd56dd-5tpxq" event={"ID":"8762406a-4b01-432f-ba53-96f3b7751e2a","Type":"ContainerStarted","Data":"57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72"} Mar 18 16:48:46.368089 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:46.368081 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:48:46.368089 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:46.368093 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f68cd56dd-5tpxq" event={"ID":"8762406a-4b01-432f-ba53-96f3b7751e2a","Type":"ContainerStarted","Data":"cf871e1bcd9ea7fc8a7dc5315383e88daa8f5a2f36160c09aa6c2cea6a028e84"} Mar 18 16:48:46.385953 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:46.385911 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f68cd56dd-5tpxq" podStartSLOduration=1.385898463 podStartE2EDuration="1.385898463s" podCreationTimestamp="2026-03-18 16:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:48:46.384121619 +0000 UTC m=+280.396230444" watchObservedRunningTime="2026-03-18 16:48:46.385898463 +0000 UTC m=+280.398007288" Mar 18 16:48:49.330143 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:49.330097 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:48:49.332437 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:49.332414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d42738c-ceaa-4925-8255-a2b61010e00f-cert\") pod \"ingress-canary-gwqfh\" (UID: \"3d42738c-ceaa-4925-8255-a2b61010e00f\") " pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:48:49.370969 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:49.370943 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tcvm2\"" Mar 18 16:48:49.379061 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:49.379037 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gwqfh" Mar 18 16:48:49.493889 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:49.493853 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gwqfh"] Mar 18 16:48:49.497103 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:48:49.497071 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d42738c_ceaa_4925_8255_a2b61010e00f.slice/crio-3fa34704f4387dda9484d43e4a0918cb7217706d6c23eac7ec05aad41d6d33ba WatchSource:0}: Error finding container 3fa34704f4387dda9484d43e4a0918cb7217706d6c23eac7ec05aad41d6d33ba: Status 404 returned error can't find the container with id 3fa34704f4387dda9484d43e4a0918cb7217706d6c23eac7ec05aad41d6d33ba Mar 18 16:48:50.379729 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:50.379684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gwqfh" event={"ID":"3d42738c-ceaa-4925-8255-a2b61010e00f","Type":"ContainerStarted","Data":"3fa34704f4387dda9484d43e4a0918cb7217706d6c23eac7ec05aad41d6d33ba"} Mar 18 16:48:51.386165 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:51.386125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gwqfh" event={"ID":"3d42738c-ceaa-4925-8255-a2b61010e00f","Type":"ContainerStarted","Data":"67beb447ef4f7b8651a31e24ef5530577f38edc2dbdc8d4ed4faa8f7630ebd4d"} Mar 18 16:48:51.402181 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:51.402124 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gwqfh" podStartSLOduration=250.660984399 podStartE2EDuration="4m12.402106178s" podCreationTimestamp="2026-03-18 16:44:39 +0000 UTC" firstStartedPulling="2026-03-18 16:48:49.498802859 +0000 UTC m=+283.510911668" lastFinishedPulling="2026-03-18 16:48:51.239924634 +0000 UTC m=+285.252033447" observedRunningTime="2026-03-18 16:48:51.400976812 +0000 UTC m=+285.413085638" watchObservedRunningTime="2026-03-18 16:48:51.402106178 +0000 UTC m=+285.414215004" Mar 18 16:48:55.354177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:55.354138 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:55.354635 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:55.354212 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:55.358916 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:55.358896 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:55.401166 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:55.401136 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:48:55.449515 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:48:55.449481 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5678f46c46-rrrz9"] Mar 18 16:49:06.471210 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:06.471182 2578 kubelet.go:1628] "Image garbage collection succeeded" Mar 18 16:49:20.474129 
Mar 18 16:49:20.474129 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.474089 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5678f46c46-rrrz9" podUID="cc0101a5-e04a-438f-9eb7-ee76683894a0" containerName="console" containerID="cri-o://1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7" gracePeriod=15
Mar 18 16:49:20.727426 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.727368 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5678f46c46-rrrz9_cc0101a5-e04a-438f-9eb7-ee76683894a0/console/0.log"
Mar 18 16:49:20.727558 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.727429 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5678f46c46-rrrz9"
Mar 18 16:49:20.899135 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899088 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-config\") pod \"cc0101a5-e04a-438f-9eb7-ee76683894a0\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") "
Mar 18 16:49:20.899341 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899163 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-trusted-ca-bundle\") pod \"cc0101a5-e04a-438f-9eb7-ee76683894a0\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") "
Mar 18 16:49:20.899341 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899222 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqjmj\" (UniqueName: \"kubernetes.io/projected/cc0101a5-e04a-438f-9eb7-ee76683894a0-kube-api-access-qqjmj\") pod \"cc0101a5-e04a-438f-9eb7-ee76683894a0\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") "
Mar 18 16:49:20.899341 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899248 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-oauth-config\") pod \"cc0101a5-e04a-438f-9eb7-ee76683894a0\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") "
Mar 18 16:49:20.899483 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899351 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-oauth-serving-cert\") pod \"cc0101a5-e04a-438f-9eb7-ee76683894a0\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") "
Mar 18 16:49:20.899483 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899404 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-service-ca\") pod \"cc0101a5-e04a-438f-9eb7-ee76683894a0\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") "
Mar 18 16:49:20.899483 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899470 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-serving-cert\") pod \"cc0101a5-e04a-438f-9eb7-ee76683894a0\" (UID: \"cc0101a5-e04a-438f-9eb7-ee76683894a0\") "
Mar 18 16:49:20.899751 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899597 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-config" (OuterVolumeSpecName: "console-config") pod "cc0101a5-e04a-438f-9eb7-ee76683894a0" (UID: "cc0101a5-e04a-438f-9eb7-ee76683894a0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:20.899751 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899740 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cc0101a5-e04a-438f-9eb7-ee76683894a0" (UID: "cc0101a5-e04a-438f-9eb7-ee76683894a0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:20.899853 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899801 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-config\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:49:20.899906 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899865 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cc0101a5-e04a-438f-9eb7-ee76683894a0" (UID: "cc0101a5-e04a-438f-9eb7-ee76683894a0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:20.899906 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.899862 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-service-ca" (OuterVolumeSpecName: "service-ca") pod "cc0101a5-e04a-438f-9eb7-ee76683894a0" (UID: "cc0101a5-e04a-438f-9eb7-ee76683894a0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:20.901658 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.901631 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cc0101a5-e04a-438f-9eb7-ee76683894a0" (UID: "cc0101a5-e04a-438f-9eb7-ee76683894a0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:20.901658 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.901628 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0101a5-e04a-438f-9eb7-ee76683894a0-kube-api-access-qqjmj" (OuterVolumeSpecName: "kube-api-access-qqjmj") pod "cc0101a5-e04a-438f-9eb7-ee76683894a0" (UID: "cc0101a5-e04a-438f-9eb7-ee76683894a0"). InnerVolumeSpecName "kube-api-access-qqjmj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:49:20.901802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:20.901690 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cc0101a5-e04a-438f-9eb7-ee76683894a0" (UID: "cc0101a5-e04a-438f-9eb7-ee76683894a0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:21.000742 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.000652 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-oauth-serving-cert\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:49:21.000742 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.000685 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-service-ca\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:49:21.000742 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.000695 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-serving-cert\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:49:21.000742 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.000705 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc0101a5-e04a-438f-9eb7-ee76683894a0-trusted-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:49:21.000742 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.000714 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qqjmj\" (UniqueName: \"kubernetes.io/projected/cc0101a5-e04a-438f-9eb7-ee76683894a0-kube-api-access-qqjmj\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:49:21.000742 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.000725 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cc0101a5-e04a-438f-9eb7-ee76683894a0-console-oauth-config\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:49:21.471148 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.471122 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5678f46c46-rrrz9_cc0101a5-e04a-438f-9eb7-ee76683894a0/console/0.log"
Mar 18 16:49:21.471321 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.471161 2578 generic.go:358] "Generic (PLEG): container finished" podID="cc0101a5-e04a-438f-9eb7-ee76683894a0" containerID="1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7" exitCode=2
Mar 18 16:49:21.471321 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.471193 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5678f46c46-rrrz9" event={"ID":"cc0101a5-e04a-438f-9eb7-ee76683894a0","Type":"ContainerDied","Data":"1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7"}
Mar 18 16:49:21.471321 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.471231 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5678f46c46-rrrz9" event={"ID":"cc0101a5-e04a-438f-9eb7-ee76683894a0","Type":"ContainerDied","Data":"8a77585d137144f244beaa471fde1a5b8c887c4c675116cbf31f24bdc49a58ae"}
Mar 18 16:49:21.471321 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.471230 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5678f46c46-rrrz9"
Mar 18 16:49:21.471321 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.471300 2578 scope.go:117] "RemoveContainer" containerID="1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7"
Mar 18 16:49:21.479815 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.479660 2578 scope.go:117] "RemoveContainer" containerID="1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7"
Mar 18 16:49:21.480060 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:49:21.479947 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7\": container with ID starting with 1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7 not found: ID does not exist" containerID="1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7"
Mar 18 16:49:21.480060 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.479970 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7"} err="failed to get container status \"1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7\": rpc error: code = NotFound desc = could not find container \"1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7\": container with ID starting with 1eeba6ccacfbb3c987d5af8cabe8f146c52d5ffc77afc978b0a7537ff3b331e7 not found: ID does not exist"
Mar 18 16:49:21.490460 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.490438 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5678f46c46-rrrz9"]
Mar 18 16:49:21.493843 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:21.493824 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5678f46c46-rrrz9"]
Mar 18 16:49:22.592154 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:22.592121 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0101a5-e04a-438f-9eb7-ee76683894a0" path="/var/lib/kubelet/pods/cc0101a5-e04a-438f-9eb7-ee76683894a0/volumes"
Mar 18 16:49:52.757347 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.757312 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fd97696f7-fffnn"]
Mar 18 16:49:52.757898 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.757578 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc0101a5-e04a-438f-9eb7-ee76683894a0" containerName="console"
Mar 18 16:49:52.757898 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.757590 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0101a5-e04a-438f-9eb7-ee76683894a0" containerName="console"
Mar 18 16:49:52.757898 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.757649 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc0101a5-e04a-438f-9eb7-ee76683894a0" containerName="console"
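Aside: the "DeleteContainer returned error ... NotFound" pair above is a benign race, not a failure: two cleanup paths issue RemoveContainer for the same ID, the first wins, and the second finds the container already gone. A short Go sketch of the general pattern (illustrative; the helper and its wiring are made up, not the kubelet's actual code path; only the gRPC status API is real):

// notfound_tolerant_delete.go — treat NotFound on delete as success.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeIgnoringNotFound wraps a CRI-style remove call; NotFound means the
// desired end state (container gone) already holds, so it is not an error.
func removeIgnoringNotFound(id string, remove func(string) error) error {
	err := remove(id)
	if status.Code(err) == codes.NotFound {
		return nil
	}
	return err
}

func main() {
	// Simulate the second deletion attempt losing the race to the first.
	alreadyGone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	fmt.Println(removeIgnoringNotFound("1eeba6cc", alreadyGone)) // <nil>
}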
Need to start a new one" pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.772277 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.772252 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fd97696f7-fffnn"] Mar 18 16:49:52.848940 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.848898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-serving-cert\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.849213 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.849194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-oauth-serving-cert\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.849354 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.849336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-trusted-ca-bundle\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.849472 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.849458 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-config\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.849607 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.849593 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-service-ca\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.849735 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.849722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-oauth-config\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.849880 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.849868 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkfq4\" (UniqueName: \"kubernetes.io/projected/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-kube-api-access-mkfq4\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951000 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.950965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-oauth-serving-cert\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951000 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.951007 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-trusted-ca-bundle\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951248 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.951025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-config\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951248 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.951044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-service-ca\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951248 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.951082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-oauth-config\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951248 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.951145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkfq4\" (UniqueName: \"kubernetes.io/projected/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-kube-api-access-mkfq4\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951248 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.951213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-serving-cert\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951857 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.951831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-oauth-serving-cert\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951955 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.951932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-service-ca\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951995 ip-10-0-143-175 kubenswrapper[2578]: 
I0318 16:49:52.951945 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-config\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.951995 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.951982 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-trusted-ca-bundle\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.953692 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.953670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-serving-cert\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.953785 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.953766 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-oauth-config\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:52.958821 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:52.958799 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkfq4\" (UniqueName: \"kubernetes.io/projected/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-kube-api-access-mkfq4\") pod \"console-5fd97696f7-fffnn\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:53.068769 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:53.068729 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:49:53.213760 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:53.213726 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fd97696f7-fffnn"] Mar 18 16:49:53.217158 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:49:53.217130 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podade3e0f6_9c50_4ce3_ab0c_587c10f7ba15.slice/crio-a8eb5053bcecb7ffa239e572c464665ded5fbebb250b9ee84feb86e3fcb886cd WatchSource:0}: Error finding container a8eb5053bcecb7ffa239e572c464665ded5fbebb250b9ee84feb86e3fcb886cd: Status 404 returned error can't find the container with id a8eb5053bcecb7ffa239e572c464665ded5fbebb250b9ee84feb86e3fcb886cd Mar 18 16:49:53.219069 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:53.219049 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:49:53.558317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:53.558283 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd97696f7-fffnn" event={"ID":"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15","Type":"ContainerStarted","Data":"4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886"} Mar 18 16:49:53.558317 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:53.558320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd97696f7-fffnn" event={"ID":"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15","Type":"ContainerStarted","Data":"a8eb5053bcecb7ffa239e572c464665ded5fbebb250b9ee84feb86e3fcb886cd"} Mar 18 16:49:53.585181 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:49:53.585116 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fd97696f7-fffnn" podStartSLOduration=1.585100509 podStartE2EDuration="1.585100509s" podCreationTimestamp="2026-03-18 16:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:49:53.583826224 +0000 UTC m=+347.595935050" watchObservedRunningTime="2026-03-18 16:49:53.585100509 +0000 UTC m=+347.597209334" Mar 18 16:50:03.069091 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:03.069006 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:50:03.069091 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:03.069048 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:50:03.073859 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:03.073836 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:50:03.592470 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:03.592441 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:50:03.640999 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:03.640957 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f68cd56dd-5tpxq"] Mar 18 16:50:28.665115 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.665040 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f68cd56dd-5tpxq" podUID="8762406a-4b01-432f-ba53-96f3b7751e2a" 
containerName="console" containerID="cri-o://57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72" gracePeriod=15 Mar 18 16:50:28.890605 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.890579 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f68cd56dd-5tpxq_8762406a-4b01-432f-ba53-96f3b7751e2a/console/0.log" Mar 18 16:50:28.890755 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.890640 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:50:28.940214 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.940135 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-serving-cert\") pod \"8762406a-4b01-432f-ba53-96f3b7751e2a\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " Mar 18 16:50:28.940214 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.940179 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkwk6\" (UniqueName: \"kubernetes.io/projected/8762406a-4b01-432f-ba53-96f3b7751e2a-kube-api-access-nkwk6\") pod \"8762406a-4b01-432f-ba53-96f3b7751e2a\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " Mar 18 16:50:28.940214 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.940213 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-service-ca\") pod \"8762406a-4b01-432f-ba53-96f3b7751e2a\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " Mar 18 16:50:28.940495 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.940238 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-console-config\") pod \"8762406a-4b01-432f-ba53-96f3b7751e2a\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " Mar 18 16:50:28.940652 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.940627 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-service-ca" (OuterVolumeSpecName: "service-ca") pod "8762406a-4b01-432f-ba53-96f3b7751e2a" (UID: "8762406a-4b01-432f-ba53-96f3b7751e2a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:50:28.940706 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.940647 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-console-config" (OuterVolumeSpecName: "console-config") pod "8762406a-4b01-432f-ba53-96f3b7751e2a" (UID: "8762406a-4b01-432f-ba53-96f3b7751e2a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:50:28.942264 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.942242 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8762406a-4b01-432f-ba53-96f3b7751e2a-kube-api-access-nkwk6" (OuterVolumeSpecName: "kube-api-access-nkwk6") pod "8762406a-4b01-432f-ba53-96f3b7751e2a" (UID: "8762406a-4b01-432f-ba53-96f3b7751e2a"). InnerVolumeSpecName "kube-api-access-nkwk6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:50:28.942320 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:28.942270 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8762406a-4b01-432f-ba53-96f3b7751e2a" (UID: "8762406a-4b01-432f-ba53-96f3b7751e2a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:50:29.041326 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.041279 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-oauth-serving-cert\") pod \"8762406a-4b01-432f-ba53-96f3b7751e2a\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " Mar 18 16:50:29.041326 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.041330 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-oauth-config\") pod \"8762406a-4b01-432f-ba53-96f3b7751e2a\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " Mar 18 16:50:29.041587 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.041368 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-trusted-ca-bundle\") pod \"8762406a-4b01-432f-ba53-96f3b7751e2a\" (UID: \"8762406a-4b01-432f-ba53-96f3b7751e2a\") " Mar 18 16:50:29.041630 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.041613 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-serving-cert\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:50:29.041663 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.041635 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nkwk6\" (UniqueName: \"kubernetes.io/projected/8762406a-4b01-432f-ba53-96f3b7751e2a-kube-api-access-nkwk6\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:50:29.041663 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.041652 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-service-ca\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:50:29.041725 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.041666 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-console-config\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:50:29.041756 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.041725 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8762406a-4b01-432f-ba53-96f3b7751e2a" (UID: "8762406a-4b01-432f-ba53-96f3b7751e2a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:50:29.041860 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.041834 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8762406a-4b01-432f-ba53-96f3b7751e2a" (UID: "8762406a-4b01-432f-ba53-96f3b7751e2a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:50:29.043413 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.043395 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8762406a-4b01-432f-ba53-96f3b7751e2a" (UID: "8762406a-4b01-432f-ba53-96f3b7751e2a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:50:29.141977 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.141945 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-oauth-serving-cert\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:50:29.141977 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.141971 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8762406a-4b01-432f-ba53-96f3b7751e2a-console-oauth-config\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:50:29.141977 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.141981 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8762406a-4b01-432f-ba53-96f3b7751e2a-trusted-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:50:29.656704 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.656673 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f68cd56dd-5tpxq_8762406a-4b01-432f-ba53-96f3b7751e2a/console/0.log" Mar 18 16:50:29.656899 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.656715 2578 generic.go:358] "Generic (PLEG): container finished" podID="8762406a-4b01-432f-ba53-96f3b7751e2a" containerID="57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72" exitCode=2 Mar 18 16:50:29.656899 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.656789 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f68cd56dd-5tpxq" Mar 18 16:50:29.656899 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.656800 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f68cd56dd-5tpxq" event={"ID":"8762406a-4b01-432f-ba53-96f3b7751e2a","Type":"ContainerDied","Data":"57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72"} Mar 18 16:50:29.656899 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.656833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f68cd56dd-5tpxq" event={"ID":"8762406a-4b01-432f-ba53-96f3b7751e2a","Type":"ContainerDied","Data":"cf871e1bcd9ea7fc8a7dc5315383e88daa8f5a2f36160c09aa6c2cea6a028e84"} Mar 18 16:50:29.656899 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.656847 2578 scope.go:117] "RemoveContainer" containerID="57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72" Mar 18 16:50:29.664545 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.664501 2578 scope.go:117] "RemoveContainer" containerID="57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72" Mar 18 16:50:29.664771 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:50:29.664754 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72\": container with ID starting with 57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72 not found: ID does not exist" containerID="57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72" Mar 18 16:50:29.664821 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.664780 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72"} err="failed to get container status \"57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72\": rpc error: code = NotFound desc = could not find container \"57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72\": container with ID starting with 57bf12264fa735d4ad95c4e50a75ac40afa68a03ece90d6d044e439b36297f72 not found: ID does not exist" Mar 18 16:50:29.676099 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.676075 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f68cd56dd-5tpxq"] Mar 18 16:50:29.679505 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:29.679485 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f68cd56dd-5tpxq"] Mar 18 16:50:30.591943 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:30.591910 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8762406a-4b01-432f-ba53-96f3b7751e2a" path="/var/lib/kubelet/pods/8762406a-4b01-432f-ba53-96f3b7751e2a/volumes" Mar 18 16:50:52.732037 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.732002 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"] Mar 18 16:50:52.732392 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.732311 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8762406a-4b01-432f-ba53-96f3b7751e2a" containerName="console" Mar 18 16:50:52.732392 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.732324 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8762406a-4b01-432f-ba53-96f3b7751e2a" containerName="console" Mar 18 16:50:52.732466 
Mar 18 16:50:52.732466 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.732399 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8762406a-4b01-432f-ba53-96f3b7751e2a" containerName="console"
Mar 18 16:50:52.736728 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.736709 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:52.738762 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.738743 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Mar 18 16:50:52.738880 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.738766 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Mar 18 16:50:52.739201 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.739184 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Mar 18 16:50:52.739250 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.739207 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Mar 18 16:50:52.744493 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.744472 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"]
Mar 18 16:50:52.804598 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.804560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a000726d-ca94-4b8b-8f11-2bc02664bbaf-klusterlet-config\") pod \"klusterlet-addon-workmgr-65847bc4cd-8b2vc\" (UID: \"a000726d-ca94-4b8b-8f11-2bc02664bbaf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:52.804802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.804689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a000726d-ca94-4b8b-8f11-2bc02664bbaf-tmp\") pod \"klusterlet-addon-workmgr-65847bc4cd-8b2vc\" (UID: \"a000726d-ca94-4b8b-8f11-2bc02664bbaf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:52.804802 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.804731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq4t9\" (UniqueName: \"kubernetes.io/projected/a000726d-ca94-4b8b-8f11-2bc02664bbaf-kube-api-access-fq4t9\") pod \"klusterlet-addon-workmgr-65847bc4cd-8b2vc\" (UID: \"a000726d-ca94-4b8b-8f11-2bc02664bbaf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:52.906074 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.906038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a000726d-ca94-4b8b-8f11-2bc02664bbaf-klusterlet-config\") pod \"klusterlet-addon-workmgr-65847bc4cd-8b2vc\" (UID: \"a000726d-ca94-4b8b-8f11-2bc02664bbaf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:52.906247 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.906092 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a000726d-ca94-4b8b-8f11-2bc02664bbaf-tmp\") pod \"klusterlet-addon-workmgr-65847bc4cd-8b2vc\" (UID: \"a000726d-ca94-4b8b-8f11-2bc02664bbaf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:52.906247 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.906113 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq4t9\" (UniqueName: \"kubernetes.io/projected/a000726d-ca94-4b8b-8f11-2bc02664bbaf-kube-api-access-fq4t9\") pod \"klusterlet-addon-workmgr-65847bc4cd-8b2vc\" (UID: \"a000726d-ca94-4b8b-8f11-2bc02664bbaf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:52.906546 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.906501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a000726d-ca94-4b8b-8f11-2bc02664bbaf-tmp\") pod \"klusterlet-addon-workmgr-65847bc4cd-8b2vc\" (UID: \"a000726d-ca94-4b8b-8f11-2bc02664bbaf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:52.908599 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.908581 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a000726d-ca94-4b8b-8f11-2bc02664bbaf-klusterlet-config\") pod \"klusterlet-addon-workmgr-65847bc4cd-8b2vc\" (UID: \"a000726d-ca94-4b8b-8f11-2bc02664bbaf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:52.914283 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:52.914257 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq4t9\" (UniqueName: \"kubernetes.io/projected/a000726d-ca94-4b8b-8f11-2bc02664bbaf-kube-api-access-fq4t9\") pod \"klusterlet-addon-workmgr-65847bc4cd-8b2vc\" (UID: \"a000726d-ca94-4b8b-8f11-2bc02664bbaf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:53.046500 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:53.046465 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:53.166725 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:53.166691 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"]
Mar 18 16:50:53.170297 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:50:53.170263 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda000726d_ca94_4b8b_8f11_2bc02664bbaf.slice/crio-8f886384cbda56c054f179b9797dc3bd82632f4715bae6684503409dd16e9ef2 WatchSource:0}: Error finding container 8f886384cbda56c054f179b9797dc3bd82632f4715bae6684503409dd16e9ef2: Status 404 returned error can't find the container with id 8f886384cbda56c054f179b9797dc3bd82632f4715bae6684503409dd16e9ef2
Mar 18 16:50:53.720411 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:53.720369 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc" event={"ID":"a000726d-ca94-4b8b-8f11-2bc02664bbaf","Type":"ContainerStarted","Data":"8f886384cbda56c054f179b9797dc3bd82632f4715bae6684503409dd16e9ef2"}
Mar 18 16:50:57.734350 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:57.734313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc" event={"ID":"a000726d-ca94-4b8b-8f11-2bc02664bbaf","Type":"ContainerStarted","Data":"99268cf07591faa5c7286cbd07315d69105fb86e7af56f555333b95baf0ad46e"}
Mar 18 16:50:57.734789 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:57.734510 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:57.736235 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:57.736214 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc"
Mar 18 16:50:57.751458 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:57.751411 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65847bc4cd-8b2vc" podStartSLOduration=1.810164423 podStartE2EDuration="5.751394847s" podCreationTimestamp="2026-03-18 16:50:52 +0000 UTC" firstStartedPulling="2026-03-18 16:50:53.172377898 +0000 UTC m=+407.184486702" lastFinishedPulling="2026-03-18 16:50:57.11360831 +0000 UTC m=+411.125717126" observedRunningTime="2026-03-18 16:50:57.75033669 +0000 UTC m=+411.762445533" watchObservedRunningTime="2026-03-18 16:50:57.751394847 +0000 UTC m=+411.763503671"
Mar 18 16:50:58.949985 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:58.949952 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688"]
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:58.955546 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:58.955512 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9wdfm\"" Mar 18 16:50:58.955659 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:58.955512 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Mar 18 16:50:58.955659 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:58.955513 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Mar 18 16:50:58.963930 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:58.963902 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688"] Mar 18 16:50:59.065104 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.065062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6bbk\" (UniqueName: \"kubernetes.io/projected/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-kube-api-access-g6bbk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:59.065276 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.065122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:59.065276 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.065189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:59.166275 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.166238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:59.166412 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.166300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:59.166412 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.166335 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g6bbk\" (UniqueName: \"kubernetes.io/projected/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-kube-api-access-g6bbk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:59.166680 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.166658 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:59.166761 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.166691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:59.174804 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.174778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6bbk\" (UniqueName: \"kubernetes.io/projected/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-kube-api-access-g6bbk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:59.262453 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.262350 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:50:59.381838 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.381801 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688"] Mar 18 16:50:59.384666 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:50:59.384639 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3cfd8a_8c2b_440f_ba9e_f91edc3682ef.slice/crio-a10c183a5ccb8d3953340966302d99f65a699b21095d5c86a1fde4e946d54073 WatchSource:0}: Error finding container a10c183a5ccb8d3953340966302d99f65a699b21095d5c86a1fde4e946d54073: Status 404 returned error can't find the container with id a10c183a5ccb8d3953340966302d99f65a699b21095d5c86a1fde4e946d54073 Mar 18 16:50:59.741405 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:50:59.741371 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" event={"ID":"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef","Type":"ContainerStarted","Data":"a10c183a5ccb8d3953340966302d99f65a699b21095d5c86a1fde4e946d54073"} Mar 18 16:51:05.760120 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:05.760086 2578 generic.go:358] "Generic (PLEG): container finished" podID="3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" containerID="b1ce01b937e1d45643a877d7f61d9c4cf35920c83e6ccd45cb557e4aed0ad03f" exitCode=0 Mar 18 16:51:05.760509 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:05.760139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" event={"ID":"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef","Type":"ContainerDied","Data":"b1ce01b937e1d45643a877d7f61d9c4cf35920c83e6ccd45cb557e4aed0ad03f"} Mar 18 16:51:08.771501 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:08.771460 2578 generic.go:358] "Generic (PLEG): container finished" podID="3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" containerID="fe674329c0925d4425d9da3397fc6154ba82681523de64ddbe44cf187875500b" exitCode=0 Mar 18 16:51:08.771919 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:08.771559 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" event={"ID":"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef","Type":"ContainerDied","Data":"fe674329c0925d4425d9da3397fc6154ba82681523de64ddbe44cf187875500b"} Mar 18 16:51:15.794519 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:15.794481 2578 generic.go:358] "Generic (PLEG): container finished" podID="3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" containerID="ee7ef6153fde321e20aa27d086821697407a4045340cf5167a7ec6d4d8f1f06f" exitCode=0 Mar 18 16:51:15.794924 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:15.794545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" event={"ID":"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef","Type":"ContainerDied","Data":"ee7ef6153fde321e20aa27d086821697407a4045340cf5167a7ec6d4d8f1f06f"} Mar 18 16:51:16.924776 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:16.924751 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:51:17.034577 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.034510 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-util\") pod \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " Mar 18 16:51:17.034764 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.034598 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6bbk\" (UniqueName: \"kubernetes.io/projected/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-kube-api-access-g6bbk\") pod \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " Mar 18 16:51:17.034764 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.034619 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-bundle\") pod \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\" (UID: \"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef\") " Mar 18 16:51:17.035265 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.035231 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-bundle" (OuterVolumeSpecName: "bundle") pod "3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" (UID: "3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:51:17.036826 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.036797 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-kube-api-access-g6bbk" (OuterVolumeSpecName: "kube-api-access-g6bbk") pod "3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" (UID: "3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef"). InnerVolumeSpecName "kube-api-access-g6bbk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:51:17.038485 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.038462 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-util" (OuterVolumeSpecName: "util") pod "3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" (UID: "3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:51:17.136157 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.136065 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-util\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:51:17.136157 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.136102 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g6bbk\" (UniqueName: \"kubernetes.io/projected/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-kube-api-access-g6bbk\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:51:17.136157 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.136113 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:51:17.801866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.801821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" event={"ID":"3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef","Type":"ContainerDied","Data":"a10c183a5ccb8d3953340966302d99f65a699b21095d5c86a1fde4e946d54073"} Mar 18 16:51:17.801866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.801856 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjl688" Mar 18 16:51:17.801866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:17.801864 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a10c183a5ccb8d3953340966302d99f65a699b21095d5c86a1fde4e946d54073" Mar 18 16:51:20.685132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.685096 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4"] Mar 18 16:51:20.685627 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.685470 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" containerName="util" Mar 18 16:51:20.685627 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.685487 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" containerName="util" Mar 18 16:51:20.685627 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.685500 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" containerName="extract" Mar 18 16:51:20.685627 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.685508 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" containerName="extract" Mar 18 16:51:20.685627 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.685557 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" containerName="pull" Mar 18 16:51:20.685627 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.685565 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" containerName="pull" Mar 18 16:51:20.685627 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.685628 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c3cfd8a-8c2b-440f-ba9e-f91edc3682ef" containerName="extract" Mar 
18 16:51:20.748641 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.748596 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4"] Mar 18 16:51:20.748824 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.748745 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" Mar 18 16:51:20.750943 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.750914 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-qjrgv\"" Mar 18 16:51:20.750943 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.750935 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Mar 18 16:51:20.751731 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.751708 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Mar 18 16:51:20.751913 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.751768 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Mar 18 16:51:20.866544 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.866492 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/e5892629-fef9-4860-8ec2-d74cecee2214-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ksts4\" (UID: \"e5892629-fef9-4860-8ec2-d74cecee2214\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" Mar 18 16:51:20.866544 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.866543 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6glnz\" (UniqueName: \"kubernetes.io/projected/e5892629-fef9-4860-8ec2-d74cecee2214-kube-api-access-6glnz\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ksts4\" (UID: \"e5892629-fef9-4860-8ec2-d74cecee2214\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" Mar 18 16:51:20.967070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.966972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/e5892629-fef9-4860-8ec2-d74cecee2214-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ksts4\" (UID: \"e5892629-fef9-4860-8ec2-d74cecee2214\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" Mar 18 16:51:20.967070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.967022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6glnz\" (UniqueName: \"kubernetes.io/projected/e5892629-fef9-4860-8ec2-d74cecee2214-kube-api-access-6glnz\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ksts4\" (UID: \"e5892629-fef9-4860-8ec2-d74cecee2214\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" Mar 18 16:51:20.969285 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.969255 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/e5892629-fef9-4860-8ec2-d74cecee2214-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ksts4\" (UID: \"e5892629-fef9-4860-8ec2-d74cecee2214\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" Mar 18 16:51:20.980750 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:20.980711 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6glnz\" (UniqueName: \"kubernetes.io/projected/e5892629-fef9-4860-8ec2-d74cecee2214-kube-api-access-6glnz\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ksts4\" (UID: \"e5892629-fef9-4860-8ec2-d74cecee2214\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" Mar 18 16:51:21.059195 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:21.059158 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" Mar 18 16:51:21.185659 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:21.185635 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4"] Mar 18 16:51:21.188569 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:51:21.188542 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5892629_fef9_4860_8ec2_d74cecee2214.slice/crio-39deede4a6c93187567a14220e052671ff547deb7dc7e945c1f7f1e0b5d887cd WatchSource:0}: Error finding container 39deede4a6c93187567a14220e052671ff547deb7dc7e945c1f7f1e0b5d887cd: Status 404 returned error can't find the container with id 39deede4a6c93187567a14220e052671ff547deb7dc7e945c1f7f1e0b5d887cd Mar 18 16:51:21.813600 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:21.813565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" event={"ID":"e5892629-fef9-4860-8ec2-d74cecee2214","Type":"ContainerStarted","Data":"39deede4a6c93187567a14220e052671ff547deb7dc7e945c1f7f1e0b5d887cd"} Mar 18 16:51:26.326792 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.326752 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-7m2zt"] Mar 18 16:51:26.342253 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.342221 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-7m2zt"] Mar 18 16:51:26.342426 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.342406 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:26.344630 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.344606 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Mar 18 16:51:26.344763 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.344609 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Mar 18 16:51:26.344997 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.344978 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-k4mn4\"" Mar 18 16:51:26.515223 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.515195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f8b45b1d-5258-46df-8637-3e19b39d3a5a-cabundle0\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:26.515394 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.515231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:26.515394 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.515259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vhmj\" (UniqueName: \"kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-kube-api-access-7vhmj\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:26.580929 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.580853 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr"] Mar 18 16:51:26.598998 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.598962 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:26.601074 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.601054 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Mar 18 16:51:26.604240 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.604214 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr"] Mar 18 16:51:26.616736 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.616710 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f8b45b1d-5258-46df-8637-3e19b39d3a5a-cabundle0\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:26.616842 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.616757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:26.616842 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.616800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vhmj\" (UniqueName: \"kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-kube-api-access-7vhmj\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:26.616924 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:26.616910 2578 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:51:26.616962 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:26.616928 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:51:26.616962 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:26.616936 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-7m2zt: references non-existent secret key: ca.crt Mar 18 16:51:26.617023 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:26.616984 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates podName:f8b45b1d-5258-46df-8637-3e19b39d3a5a nodeName:}" failed. No retries permitted until 2026-03-18 16:51:27.116967131 +0000 UTC m=+441.129075947 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates") pod "keda-operator-ffbb595cb-7m2zt" (UID: "f8b45b1d-5258-46df-8637-3e19b39d3a5a") : references non-existent secret key: ca.crt Mar 18 16:51:26.617301 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.617284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f8b45b1d-5258-46df-8637-3e19b39d3a5a-cabundle0\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:26.627380 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.627354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vhmj\" (UniqueName: \"kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-kube-api-access-7vhmj\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:26.717580 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.717538 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:26.717580 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.717581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86jtb\" (UniqueName: \"kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-kube-api-access-86jtb\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:26.717848 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.717722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/2af4d381-5c30-48d1-a799-f0efd5f9ba83-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:26.818801 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.818756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:26.818801 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.818801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86jtb\" (UniqueName: \"kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-kube-api-access-86jtb\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:26.818990 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.818850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: 
\"kubernetes.io/empty-dir/2af4d381-5c30-48d1-a799-f0efd5f9ba83-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:26.818990 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:26.818914 2578 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:51:26.818990 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:26.818938 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:51:26.818990 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:26.818962 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr: references non-existent secret key: tls.crt Mar 18 16:51:26.819160 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:26.819026 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates podName:2af4d381-5c30-48d1-a799-f0efd5f9ba83 nodeName:}" failed. No retries permitted until 2026-03-18 16:51:27.31900856 +0000 UTC m=+441.331117363 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates") pod "keda-metrics-apiserver-7c9f485588-nrwwr" (UID: "2af4d381-5c30-48d1-a799-f0efd5f9ba83") : references non-existent secret key: tls.crt Mar 18 16:51:26.819236 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.819221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/2af4d381-5c30-48d1-a799-f0efd5f9ba83-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:26.829132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.829099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86jtb\" (UniqueName: \"kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-kube-api-access-86jtb\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:26.831074 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.830980 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" event={"ID":"e5892629-fef9-4860-8ec2-d74cecee2214","Type":"ContainerStarted","Data":"1a79bcf79691f9aebebee022dadb139ef91b12ca51edebabb95a153499f0251e"} Mar 18 16:51:26.831211 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.831111 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" Mar 18 16:51:26.851288 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:26.851235 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" podStartSLOduration=2.279302657 podStartE2EDuration="6.851219589s" podCreationTimestamp="2026-03-18 16:51:20 +0000 UTC" firstStartedPulling="2026-03-18 16:51:21.190375236 +0000 UTC m=+435.202484039" lastFinishedPulling="2026-03-18 16:51:25.762292165 +0000 UTC m=+439.774400971" observedRunningTime="2026-03-18 16:51:26.849063734 +0000 UTC 
m=+440.861172559" watchObservedRunningTime="2026-03-18 16:51:26.851219589 +0000 UTC m=+440.863328413" Mar 18 16:51:27.121844 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:27.121744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:27.122014 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:27.121883 2578 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:51:27.122014 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:27.121902 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:51:27.122014 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:27.121911 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-7m2zt: references non-existent secret key: ca.crt Mar 18 16:51:27.122014 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:27.121963 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates podName:f8b45b1d-5258-46df-8637-3e19b39d3a5a nodeName:}" failed. No retries permitted until 2026-03-18 16:51:28.121948936 +0000 UTC m=+442.134057739 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates") pod "keda-operator-ffbb595cb-7m2zt" (UID: "f8b45b1d-5258-46df-8637-3e19b39d3a5a") : references non-existent secret key: ca.crt Mar 18 16:51:27.322982 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:27.322937 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:27.323155 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:27.323066 2578 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:51:27.323155 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:27.323084 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:51:27.323155 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:27.323102 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr: references non-existent secret key: tls.crt Mar 18 16:51:27.323155 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:27.323155 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates podName:2af4d381-5c30-48d1-a799-f0efd5f9ba83 nodeName:}" failed. No retries permitted until 2026-03-18 16:51:28.323140119 +0000 UTC m=+442.335248921 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates") pod "keda-metrics-apiserver-7c9f485588-nrwwr" (UID: "2af4d381-5c30-48d1-a799-f0efd5f9ba83") : references non-existent secret key: tls.crt Mar 18 16:51:28.129434 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:28.129378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:28.129866 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:28.129576 2578 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:51:28.129866 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:28.129598 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:51:28.129866 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:28.129609 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-7m2zt: references non-existent secret key: ca.crt Mar 18 16:51:28.129866 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:28.129672 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates podName:f8b45b1d-5258-46df-8637-3e19b39d3a5a nodeName:}" failed. No retries permitted until 2026-03-18 16:51:30.129656443 +0000 UTC m=+444.141765246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates") pod "keda-operator-ffbb595cb-7m2zt" (UID: "f8b45b1d-5258-46df-8637-3e19b39d3a5a") : references non-existent secret key: ca.crt Mar 18 16:51:28.331383 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:28.331341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:28.331555 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:28.331489 2578 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:51:28.331555 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:28.331506 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:51:28.331555 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:28.331550 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr: references non-existent secret key: tls.crt Mar 18 16:51:28.331675 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:28.331630 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates podName:2af4d381-5c30-48d1-a799-f0efd5f9ba83 nodeName:}" failed. No retries permitted until 2026-03-18 16:51:30.331609918 +0000 UTC m=+444.343718720 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates") pod "keda-metrics-apiserver-7c9f485588-nrwwr" (UID: "2af4d381-5c30-48d1-a799-f0efd5f9ba83") : references non-existent secret key: tls.crt Mar 18 16:51:30.146538 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:30.146496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:30.146976 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:30.146640 2578 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:51:30.146976 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:30.146658 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:51:30.146976 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:30.146668 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-7m2zt: references non-existent secret key: ca.crt Mar 18 16:51:30.146976 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:30.146718 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates podName:f8b45b1d-5258-46df-8637-3e19b39d3a5a nodeName:}" failed. No retries permitted until 2026-03-18 16:51:34.146704906 +0000 UTC m=+448.158813709 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates") pod "keda-operator-ffbb595cb-7m2zt" (UID: "f8b45b1d-5258-46df-8637-3e19b39d3a5a") : references non-existent secret key: ca.crt Mar 18 16:51:30.348172 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:30.348135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:30.348372 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:30.348274 2578 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:51:30.348372 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:30.348290 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:51:30.348372 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:30.348308 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr: references non-existent secret key: tls.crt Mar 18 16:51:30.348372 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:51:30.348367 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates podName:2af4d381-5c30-48d1-a799-f0efd5f9ba83 nodeName:}" failed. No retries permitted until 2026-03-18 16:51:34.348352579 +0000 UTC m=+448.360461382 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates") pod "keda-metrics-apiserver-7c9f485588-nrwwr" (UID: "2af4d381-5c30-48d1-a799-f0efd5f9ba83") : references non-existent secret key: tls.crt Mar 18 16:51:34.182405 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:34.182359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:34.184940 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:34.184915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f8b45b1d-5258-46df-8637-3e19b39d3a5a-certificates\") pod \"keda-operator-ffbb595cb-7m2zt\" (UID: \"f8b45b1d-5258-46df-8637-3e19b39d3a5a\") " pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:34.384622 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:34.384570 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:34.387128 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:34.387098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2af4d381-5c30-48d1-a799-f0efd5f9ba83-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nrwwr\" (UID: \"2af4d381-5c30-48d1-a799-f0efd5f9ba83\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:34.409948 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:34.409917 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:34.452896 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:34.452809 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-7m2zt"
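[Editor's note] The retry spacing in the failures above — durationBeforeRetry 500ms, then 1s, 2s, 4s — is the kubelet's per-volume exponential backoff in nestedpendingoperations.go: each consecutive SetUp failure doubles the wait, and the operation is ineligible to run again until the "No retries permitted until" deadline passes. Once the missing keys appear in kedaorg-certs, the 16:51:34 attempts succeed on the first try after the 4s window. A minimal sketch of that doubling schedule follows; the initial 500ms matches the log, while the cap value is an assumption for illustration, not taken from this log.

```go
// A minimal sketch of the doubling backoff visible in the log (500ms, 1s, 2s,
// 4s between consecutive "No retries permitted until ..." records).
package main

import (
	"fmt"
	"time"
)

const (
	initialDelay = 500 * time.Millisecond // matches durationBeforeRetry 500ms in the log
	maxDelay     = 2 * time.Minute        // assumed cap, for illustration only
)

// delayAfter returns the wait imposed after the n-th consecutive failure:
// n=1 gives 500ms, then 1s, 2s, 4s, ... up to maxDelay.
func delayAfter(n int) time.Duration {
	d := initialDelay
	for i := 1; i < n; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for n := 1; n <= 5; n++ {
		fmt.Printf("failure %d -> retry in %v\n", n, delayAfter(n))
	}
}
```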
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:34.557323 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:34.557299 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr"] Mar 18 16:51:34.559862 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:51:34.559832 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af4d381_5c30_48d1_a799_f0efd5f9ba83.slice/crio-4c2fa99b0750e49dedb5895b7f6a5a44ced8ef15384e952a77d4d711e62ca918 WatchSource:0}: Error finding container 4c2fa99b0750e49dedb5895b7f6a5a44ced8ef15384e952a77d4d711e62ca918: Status 404 returned error can't find the container with id 4c2fa99b0750e49dedb5895b7f6a5a44ced8ef15384e952a77d4d711e62ca918 Mar 18 16:51:34.614623 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:34.614583 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-7m2zt"] Mar 18 16:51:34.618261 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:51:34.618233 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b45b1d_5258_46df_8637_3e19b39d3a5a.slice/crio-d68ca19c2a3d9fd5091ed3cae9c52fa506dec874f9ecb91e10ab405969f63180 WatchSource:0}: Error finding container d68ca19c2a3d9fd5091ed3cae9c52fa506dec874f9ecb91e10ab405969f63180: Status 404 returned error can't find the container with id d68ca19c2a3d9fd5091ed3cae9c52fa506dec874f9ecb91e10ab405969f63180 Mar 18 16:51:34.856442 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:34.856399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" event={"ID":"2af4d381-5c30-48d1-a799-f0efd5f9ba83","Type":"ContainerStarted","Data":"4c2fa99b0750e49dedb5895b7f6a5a44ced8ef15384e952a77d4d711e62ca918"} Mar 18 16:51:34.857481 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:34.857455 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" event={"ID":"f8b45b1d-5258-46df-8637-3e19b39d3a5a","Type":"ContainerStarted","Data":"d68ca19c2a3d9fd5091ed3cae9c52fa506dec874f9ecb91e10ab405969f63180"} Mar 18 16:51:38.879286 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:38.879248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" event={"ID":"2af4d381-5c30-48d1-a799-f0efd5f9ba83","Type":"ContainerStarted","Data":"1c7d7afe1e6554051bc0cabe19579ce40f278184f5a1ef6c89f419dd53e2738b"} Mar 18 16:51:38.879796 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:38.879362 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:38.880636 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:38.880615 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" event={"ID":"f8b45b1d-5258-46df-8637-3e19b39d3a5a","Type":"ContainerStarted","Data":"4214c881cd06026165e39e81832af0d436fc95e5ed17faa620b37e7aaafcfbb4"} Mar 18 16:51:38.880742 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:38.880724 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:51:38.895019 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:38.894976 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" podStartSLOduration=9.077462982 podStartE2EDuration="12.89496182s" podCreationTimestamp="2026-03-18 16:51:26 +0000 UTC" firstStartedPulling="2026-03-18 16:51:34.561154659 +0000 UTC m=+448.573263462" lastFinishedPulling="2026-03-18 16:51:38.378653483 +0000 UTC m=+452.390762300" observedRunningTime="2026-03-18 16:51:38.89355742 +0000 UTC m=+452.905666246" watchObservedRunningTime="2026-03-18 16:51:38.89496182 +0000 UTC m=+452.907070690" Mar 18 16:51:38.908663 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:38.908607 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" podStartSLOduration=9.143808571 podStartE2EDuration="12.908588252s" podCreationTimestamp="2026-03-18 16:51:26 +0000 UTC" firstStartedPulling="2026-03-18 16:51:34.619496907 +0000 UTC m=+448.631605710" lastFinishedPulling="2026-03-18 16:51:38.384276585 +0000 UTC m=+452.396385391" observedRunningTime="2026-03-18 16:51:38.907177538 +0000 UTC m=+452.919286366" watchObservedRunningTime="2026-03-18 16:51:38.908588252 +0000 UTC m=+452.920697078" Mar 18 16:51:47.836714 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:47.836680 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ksts4" Mar 18 16:51:49.889441 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:49.889404 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nrwwr" Mar 18 16:51:59.886054 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:51:59.886018 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-7m2zt" Mar 18 16:52:32.108505 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.108469 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-nqjtv"] Mar 18 16:52:32.111677 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.111657 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:32.113886 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.113848 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:52:32.114205 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.114186 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:52:32.114300 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.114206 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-d6b54\"" Mar 18 16:52:32.114300 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.114208 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Mar 18 16:52:32.115150 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.115131 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-npvlf"] Mar 18 16:52:32.118098 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.118082 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" Mar 18 16:52:32.120826 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.120565 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-jhqd7\"" Mar 18 16:52:32.121293 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.121271 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Mar 18 16:52:32.122630 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.122610 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-nqjtv"] Mar 18 16:52:32.130416 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.130394 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-npvlf"] Mar 18 16:52:32.180698 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.180664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9tww\" (UniqueName: \"kubernetes.io/projected/4c96b482-f91c-4de6-8c41-5581cd74ad2f-kube-api-access-w9tww\") pod \"kserve-controller-manager-69d7c9bbdc-nqjtv\" (UID: \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:32.180888 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.180800 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9xtg\" (UniqueName: \"kubernetes.io/projected/8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31-kube-api-access-v9xtg\") pod \"llmisvc-controller-manager-68cc5db7c4-npvlf\" (UID: \"8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" Mar 18 16:52:32.180888 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.180848 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c96b482-f91c-4de6-8c41-5581cd74ad2f-cert\") pod \"kserve-controller-manager-69d7c9bbdc-nqjtv\" (UID: \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:32.180888 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.180874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-npvlf\" (UID: \"8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" Mar 18 16:52:32.281756 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.281717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9xtg\" (UniqueName: \"kubernetes.io/projected/8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31-kube-api-access-v9xtg\") pod \"llmisvc-controller-manager-68cc5db7c4-npvlf\" (UID: \"8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" Mar 18 16:52:32.281953 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.281765 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c96b482-f91c-4de6-8c41-5581cd74ad2f-cert\") pod \"kserve-controller-manager-69d7c9bbdc-nqjtv\" (UID: \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 
16:52:32.281953 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.281790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-npvlf\" (UID: \"8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" Mar 18 16:52:32.281953 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.281850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9tww\" (UniqueName: \"kubernetes.io/projected/4c96b482-f91c-4de6-8c41-5581cd74ad2f-kube-api-access-w9tww\") pod \"kserve-controller-manager-69d7c9bbdc-nqjtv\" (UID: \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:32.281953 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:52:32.281904 2578 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Mar 18 16:52:32.282159 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:52:32.281993 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c96b482-f91c-4de6-8c41-5581cd74ad2f-cert podName:4c96b482-f91c-4de6-8c41-5581cd74ad2f nodeName:}" failed. No retries permitted until 2026-03-18 16:52:32.781960516 +0000 UTC m=+506.794069337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c96b482-f91c-4de6-8c41-5581cd74ad2f-cert") pod "kserve-controller-manager-69d7c9bbdc-nqjtv" (UID: "4c96b482-f91c-4de6-8c41-5581cd74ad2f") : secret "kserve-webhook-server-cert" not found Mar 18 16:52:32.284358 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.284331 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-npvlf\" (UID: \"8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" Mar 18 16:52:32.290351 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.290328 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9xtg\" (UniqueName: \"kubernetes.io/projected/8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31-kube-api-access-v9xtg\") pod \"llmisvc-controller-manager-68cc5db7c4-npvlf\" (UID: \"8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" Mar 18 16:52:32.290748 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.290729 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9tww\" (UniqueName: \"kubernetes.io/projected/4c96b482-f91c-4de6-8c41-5581cd74ad2f-kube-api-access-w9tww\") pod \"kserve-controller-manager-69d7c9bbdc-nqjtv\" (UID: \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:32.433737 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.433660 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" Mar 18 16:52:32.555643 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.555612 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-npvlf"] Mar 18 16:52:32.558165 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:52:32.558133 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8b14e9b6_c9f5_45f4_a14d_34e6ce85ba31.slice/crio-587814c009158a9ae2e5360bf7c6abd2cf1c39fba118f3a6dd472c5594c0aa28 WatchSource:0}: Error finding container 587814c009158a9ae2e5360bf7c6abd2cf1c39fba118f3a6dd472c5594c0aa28: Status 404 returned error can't find the container with id 587814c009158a9ae2e5360bf7c6abd2cf1c39fba118f3a6dd472c5594c0aa28 Mar 18 16:52:32.791129 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.791100 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c96b482-f91c-4de6-8c41-5581cd74ad2f-cert\") pod \"kserve-controller-manager-69d7c9bbdc-nqjtv\" (UID: \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:32.793410 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:32.793389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c96b482-f91c-4de6-8c41-5581cd74ad2f-cert\") pod \"kserve-controller-manager-69d7c9bbdc-nqjtv\" (UID: \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:33.024045 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:33.024013 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:33.052725 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:33.052650 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" event={"ID":"8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31","Type":"ContainerStarted","Data":"587814c009158a9ae2e5360bf7c6abd2cf1c39fba118f3a6dd472c5594c0aa28"} Mar 18 16:52:33.071016 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:33.070957 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-nqjtv"] Mar 18 16:52:33.156372 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:33.156343 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-nqjtv"] Mar 18 16:52:33.180131 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:52:33.180092 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c96b482_f91c_4de6_8c41_5581cd74ad2f.slice/crio-8c574ee72dab8fa989468c84af2cbd2148a9dbef9730ca87dc4ed994ccd2a71e WatchSource:0}: Error finding container 8c574ee72dab8fa989468c84af2cbd2148a9dbef9730ca87dc4ed994ccd2a71e: Status 404 returned error can't find the container with id 8c574ee72dab8fa989468c84af2cbd2148a9dbef9730ca87dc4ed994ccd2a71e Mar 18 16:52:34.060365 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:34.060306 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" event={"ID":"4c96b482-f91c-4de6-8c41-5581cd74ad2f","Type":"ContainerStarted","Data":"8c574ee72dab8fa989468c84af2cbd2148a9dbef9730ca87dc4ed994ccd2a71e"} Mar 18 16:52:37.071073 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.071038 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" event={"ID":"8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31","Type":"ContainerStarted","Data":"fd728bed133a1ecdc6c82cc1df4ce9cc3fb4b92163644c045dc8af3603d19c97"} Mar 18 16:52:37.071545 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.071100 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" Mar 18 16:52:37.072344 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.072322 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" event={"ID":"4c96b482-f91c-4de6-8c41-5581cd74ad2f","Type":"ContainerStarted","Data":"6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee"} Mar 18 16:52:37.072440 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.072420 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:37.072440 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.072399 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" podUID="4c96b482-f91c-4de6-8c41-5581cd74ad2f" containerName="manager" containerID="cri-o://6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee" gracePeriod=10 Mar 18 16:52:37.088947 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.088906 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" podStartSLOduration=1.6004872479999999 podStartE2EDuration="5.088894307s" podCreationTimestamp="2026-03-18 16:52:32 +0000 UTC" firstStartedPulling="2026-03-18 16:52:32.55976776 +0000 UTC m=+506.571876562" lastFinishedPulling="2026-03-18 16:52:36.048174815 +0000 UTC m=+510.060283621" observedRunningTime="2026-03-18 16:52:37.087134401 +0000 UTC m=+511.099243228" watchObservedRunningTime="2026-03-18 16:52:37.088894307 +0000 UTC m=+511.101003132" Mar 18 16:52:37.102564 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.102513 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" podStartSLOduration=2.18870767 podStartE2EDuration="5.102502036s" podCreationTimestamp="2026-03-18 16:52:32 +0000 UTC" firstStartedPulling="2026-03-18 16:52:33.18192128 +0000 UTC m=+507.194030084" lastFinishedPulling="2026-03-18 16:52:36.095715647 +0000 UTC m=+510.107824450" observedRunningTime="2026-03-18 16:52:37.101870682 +0000 UTC m=+511.113979505" watchObservedRunningTime="2026-03-18 16:52:37.102502036 +0000 UTC m=+511.114610893" Mar 18 16:52:37.798841 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.798815 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:37.939433 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.939339 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c96b482-f91c-4de6-8c41-5581cd74ad2f-cert\") pod \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\" (UID: \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\") " Mar 18 16:52:37.939641 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.939436 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9tww\" (UniqueName: \"kubernetes.io/projected/4c96b482-f91c-4de6-8c41-5581cd74ad2f-kube-api-access-w9tww\") pod \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\" (UID: \"4c96b482-f91c-4de6-8c41-5581cd74ad2f\") " Mar 18 16:52:37.941652 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.941618 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c96b482-f91c-4de6-8c41-5581cd74ad2f-cert" (OuterVolumeSpecName: "cert") pod "4c96b482-f91c-4de6-8c41-5581cd74ad2f" (UID: "4c96b482-f91c-4de6-8c41-5581cd74ad2f"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:52:37.941744 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:37.941708 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c96b482-f91c-4de6-8c41-5581cd74ad2f-kube-api-access-w9tww" (OuterVolumeSpecName: "kube-api-access-w9tww") pod "4c96b482-f91c-4de6-8c41-5581cd74ad2f" (UID: "4c96b482-f91c-4de6-8c41-5581cd74ad2f"). InnerVolumeSpecName "kube-api-access-w9tww". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:52:38.040294 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.040261 2578 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c96b482-f91c-4de6-8c41-5581cd74ad2f-cert\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:52:38.040294 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.040291 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w9tww\" (UniqueName: \"kubernetes.io/projected/4c96b482-f91c-4de6-8c41-5581cd74ad2f-kube-api-access-w9tww\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:52:38.076864 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.076829 2578 generic.go:358] "Generic (PLEG): container finished" podID="4c96b482-f91c-4de6-8c41-5581cd74ad2f" containerID="6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee" exitCode=0 Mar 18 16:52:38.077296 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.076882 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" event={"ID":"4c96b482-f91c-4de6-8c41-5581cd74ad2f","Type":"ContainerDied","Data":"6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee"} Mar 18 16:52:38.077296 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.076924 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" event={"ID":"4c96b482-f91c-4de6-8c41-5581cd74ad2f","Type":"ContainerDied","Data":"8c574ee72dab8fa989468c84af2cbd2148a9dbef9730ca87dc4ed994ccd2a71e"} Mar 18 16:52:38.077296 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.076940 2578 scope.go:117] "RemoveContainer" containerID="6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee" Mar 18 16:52:38.077296 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:52:38.076892 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-nqjtv" Mar 18 16:52:38.086925 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.086895 2578 scope.go:117] "RemoveContainer" containerID="6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee" Mar 18 16:52:38.087604 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:52:38.087579 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee\": container with ID starting with 6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee not found: ID does not exist" containerID="6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee" Mar 18 16:52:38.087691 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.087615 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee"} err="failed to get container status \"6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee\": rpc error: code = NotFound desc = could not find container \"6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee\": container with ID starting with 6927c013014e5b76ce8a37a3872aa86af924dbc3596bacf8a03df92d7adf1cee not found: ID does not exist" Mar 18 16:52:38.099983 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.099953 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-nqjtv"] Mar 18 16:52:38.103178 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.103157 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-nqjtv"] Mar 18 16:52:38.592587 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:52:38.592557 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c96b482-f91c-4de6-8c41-5581cd74ad2f" path="/var/lib/kubelet/pods/4c96b482-f91c-4de6-8c41-5581cd74ad2f/volumes" Mar 18 16:53:08.079956 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:08.079923 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-npvlf" Mar 18 16:53:09.198234 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.198199 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-9699c8d45-stcsx"] Mar 18 16:53:09.198854 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.198519 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c96b482-f91c-4de6-8c41-5581cd74ad2f" containerName="manager" Mar 18 16:53:09.198854 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.198549 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c96b482-f91c-4de6-8c41-5581cd74ad2f" containerName="manager" Mar 18 16:53:09.198854 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.198598 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c96b482-f91c-4de6-8c41-5581cd74ad2f" containerName="manager" Mar 18 16:53:09.272287 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.272245 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-9699c8d45-stcsx"] Mar 18 16:53:09.272287 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.272286 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-6lk5g"] Mar 18 
16:53:09.272517 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.272407 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-9699c8d45-stcsx" Mar 18 16:53:09.275382 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.275358 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-d7jfz\"" Mar 18 16:53:09.275517 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.275360 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Mar 18 16:53:09.298077 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.298042 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6lk5g"] Mar 18 16:53:09.298208 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.298088 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6lk5g" Mar 18 16:53:09.300088 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.300066 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Mar 18 16:53:09.300088 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.300080 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-82rjj\"" Mar 18 16:53:09.301138 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.301116 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f796fc2-c6ea-4936-8fae-6cd0d2b876db-tls-certs\") pod \"model-serving-api-9699c8d45-stcsx\" (UID: \"4f796fc2-c6ea-4936-8fae-6cd0d2b876db\") " pod="kserve/model-serving-api-9699c8d45-stcsx" Mar 18 16:53:09.301239 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.301173 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp5hl\" (UniqueName: \"kubernetes.io/projected/4f796fc2-c6ea-4936-8fae-6cd0d2b876db-kube-api-access-xp5hl\") pod \"model-serving-api-9699c8d45-stcsx\" (UID: \"4f796fc2-c6ea-4936-8fae-6cd0d2b876db\") " pod="kserve/model-serving-api-9699c8d45-stcsx" Mar 18 16:53:09.402090 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.402044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xp5hl\" (UniqueName: \"kubernetes.io/projected/4f796fc2-c6ea-4936-8fae-6cd0d2b876db-kube-api-access-xp5hl\") pod \"model-serving-api-9699c8d45-stcsx\" (UID: \"4f796fc2-c6ea-4936-8fae-6cd0d2b876db\") " pod="kserve/model-serving-api-9699c8d45-stcsx" Mar 18 16:53:09.402090 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.402098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0-cert\") pod \"odh-model-controller-696fc77849-6lk5g\" (UID: \"4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0\") " pod="kserve/odh-model-controller-696fc77849-6lk5g" Mar 18 16:53:09.402354 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.402129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q58q6\" (UniqueName: \"kubernetes.io/projected/4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0-kube-api-access-q58q6\") pod \"odh-model-controller-696fc77849-6lk5g\" (UID: \"4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0\") " 
pod="kserve/odh-model-controller-696fc77849-6lk5g" Mar 18 16:53:09.402354 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.402162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f796fc2-c6ea-4936-8fae-6cd0d2b876db-tls-certs\") pod \"model-serving-api-9699c8d45-stcsx\" (UID: \"4f796fc2-c6ea-4936-8fae-6cd0d2b876db\") " pod="kserve/model-serving-api-9699c8d45-stcsx" Mar 18 16:53:09.404644 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.404617 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f796fc2-c6ea-4936-8fae-6cd0d2b876db-tls-certs\") pod \"model-serving-api-9699c8d45-stcsx\" (UID: \"4f796fc2-c6ea-4936-8fae-6cd0d2b876db\") " pod="kserve/model-serving-api-9699c8d45-stcsx" Mar 18 16:53:09.411611 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.411583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp5hl\" (UniqueName: \"kubernetes.io/projected/4f796fc2-c6ea-4936-8fae-6cd0d2b876db-kube-api-access-xp5hl\") pod \"model-serving-api-9699c8d45-stcsx\" (UID: \"4f796fc2-c6ea-4936-8fae-6cd0d2b876db\") " pod="kserve/model-serving-api-9699c8d45-stcsx" Mar 18 16:53:09.502703 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.502597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q58q6\" (UniqueName: \"kubernetes.io/projected/4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0-kube-api-access-q58q6\") pod \"odh-model-controller-696fc77849-6lk5g\" (UID: \"4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0\") " pod="kserve/odh-model-controller-696fc77849-6lk5g" Mar 18 16:53:09.502875 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.502707 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0-cert\") pod \"odh-model-controller-696fc77849-6lk5g\" (UID: \"4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0\") " pod="kserve/odh-model-controller-696fc77849-6lk5g" Mar 18 16:53:09.505654 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.505623 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0-cert\") pod \"odh-model-controller-696fc77849-6lk5g\" (UID: \"4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0\") " pod="kserve/odh-model-controller-696fc77849-6lk5g" Mar 18 16:53:09.511461 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.511432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q58q6\" (UniqueName: \"kubernetes.io/projected/4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0-kube-api-access-q58q6\") pod \"odh-model-controller-696fc77849-6lk5g\" (UID: \"4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0\") " pod="kserve/odh-model-controller-696fc77849-6lk5g" Mar 18 16:53:09.584407 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.584372 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-9699c8d45-stcsx" Mar 18 16:53:09.608653 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.608626 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6lk5g" Mar 18 16:53:09.731743 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.731720 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-9699c8d45-stcsx"] Mar 18 16:53:09.734710 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:53:09.734678 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f796fc2_c6ea_4936_8fae_6cd0d2b876db.slice/crio-7b3376b20aa38d8844003e400e01e69587129b3b31be55bc26a860f18df30d20 WatchSource:0}: Error finding container 7b3376b20aa38d8844003e400e01e69587129b3b31be55bc26a860f18df30d20: Status 404 returned error can't find the container with id 7b3376b20aa38d8844003e400e01e69587129b3b31be55bc26a860f18df30d20 Mar 18 16:53:09.748413 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:09.748392 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6lk5g"] Mar 18 16:53:09.750464 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:53:09.750439 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b2bc59c_ccc3_4fdc_9cde_f8bd31d1e1f0.slice/crio-b280cd085ec06f092c4ae9f6cc3dd0c0af62531042a31038a125b670e11cda1a WatchSource:0}: Error finding container b280cd085ec06f092c4ae9f6cc3dd0c0af62531042a31038a125b670e11cda1a: Status 404 returned error can't find the container with id b280cd085ec06f092c4ae9f6cc3dd0c0af62531042a31038a125b670e11cda1a Mar 18 16:53:10.010582 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:10.010460 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:53:10.010735 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:10.010690 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:53:10.011873 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:10.011840 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:53:10.179878 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:10.179842 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-9699c8d45-stcsx" event={"ID":"4f796fc2-c6ea-4936-8fae-6cd0d2b876db","Type":"ContainerStarted","Data":"7b3376b20aa38d8844003e400e01e69587129b3b31be55bc26a860f18df30d20"} Mar 18 16:53:10.180927 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:10.180885 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" 
pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:53:10.181031 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:10.180937 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6lk5g" event={"ID":"4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0","Type":"ContainerStarted","Data":"b280cd085ec06f092c4ae9f6cc3dd0c0af62531042a31038a125b670e11cda1a"} Mar 18 16:53:11.187193 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:11.187135 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:53:13.194046 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:13.194014 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6lk5g" event={"ID":"4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0","Type":"ContainerStarted","Data":"d0d2b2ad103219c92067ed585a55372b5e5f8c5d22d13b0703e11be59b638f3f"} Mar 18 16:53:13.194436 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:13.194175 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-6lk5g" Mar 18 16:53:13.211866 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:13.211810 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-6lk5g" podStartSLOduration=1.45873961 podStartE2EDuration="4.211792023s" podCreationTimestamp="2026-03-18 16:53:09 +0000 UTC" firstStartedPulling="2026-03-18 16:53:09.751805026 +0000 UTC m=+543.763913829" lastFinishedPulling="2026-03-18 16:53:12.504857436 +0000 UTC m=+546.516966242" observedRunningTime="2026-03-18 16:53:13.210116022 +0000 UTC m=+547.222224847" watchObservedRunningTime="2026-03-18 16:53:13.211792023 +0000 UTC m=+547.223900849" Mar 18 16:53:22.891256 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:22.891161 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:53:22.891636 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:22.891347 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:53:22.892556 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:22.892513 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:53:24.199630 
ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:24.199598 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-6lk5g" Mar 18 16:53:36.590969 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:36.590939 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:53:43.776031 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.775935 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb"] Mar 18 16:53:43.782730 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.782705 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:53:43.785078 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.785027 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Mar 18 16:53:43.785916 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.785893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lns69\"" Mar 18 16:53:43.786051 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.785930 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Mar 18 16:53:43.787719 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.787696 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb"] Mar 18 16:53:43.893255 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.893222 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn6ns\" (UniqueName: \"kubernetes.io/projected/e9d24bd6-a499-475b-81ac-535409aeb9bf-kube-api-access-vn6ns\") pod \"isvc-sklearn-graph-1-predictor-6579795fb5-vvblb\" (UID: \"e9d24bd6-a499-475b-81ac-535409aeb9bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:53:43.893419 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.893286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9d24bd6-a499-475b-81ac-535409aeb9bf-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6579795fb5-vvblb\" (UID: \"e9d24bd6-a499-475b-81ac-535409aeb9bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:53:43.957557 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.957507 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k"] Mar 18 16:53:43.961817 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:53:43.961795 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" Mar 18 16:53:43.967701 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.967674 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k"] Mar 18 16:53:43.994027 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.993995 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn6ns\" (UniqueName: \"kubernetes.io/projected/e9d24bd6-a499-475b-81ac-535409aeb9bf-kube-api-access-vn6ns\") pod \"isvc-sklearn-graph-1-predictor-6579795fb5-vvblb\" (UID: \"e9d24bd6-a499-475b-81ac-535409aeb9bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:53:43.994181 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.994058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9d24bd6-a499-475b-81ac-535409aeb9bf-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6579795fb5-vvblb\" (UID: \"e9d24bd6-a499-475b-81ac-535409aeb9bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:53:43.994384 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:43.994365 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9d24bd6-a499-475b-81ac-535409aeb9bf-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-6579795fb5-vvblb\" (UID: \"e9d24bd6-a499-475b-81ac-535409aeb9bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:53:44.006213 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.006190 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn6ns\" (UniqueName: \"kubernetes.io/projected/e9d24bd6-a499-475b-81ac-535409aeb9bf-kube-api-access-vn6ns\") pod \"isvc-sklearn-graph-1-predictor-6579795fb5-vvblb\" (UID: \"e9d24bd6-a499-475b-81ac-535409aeb9bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:53:44.094832 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.094796 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrl6q\" (UniqueName: \"kubernetes.io/projected/0172b66d-e446-422b-a91f-901dcaad88d2-kube-api-access-qrl6q\") pod \"success-200-isvc-a225b-predictor-b8745d9cc-cqs9k\" (UID: \"0172b66d-e446-422b-a91f-901dcaad88d2\") " pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" Mar 18 16:53:44.095006 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.094853 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:53:44.195649 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.195607 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrl6q\" (UniqueName: \"kubernetes.io/projected/0172b66d-e446-422b-a91f-901dcaad88d2-kube-api-access-qrl6q\") pod \"success-200-isvc-a225b-predictor-b8745d9cc-cqs9k\" (UID: \"0172b66d-e446-422b-a91f-901dcaad88d2\") " pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" Mar 18 16:53:44.204102 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.204044 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrl6q\" (UniqueName: \"kubernetes.io/projected/0172b66d-e446-422b-a91f-901dcaad88d2-kube-api-access-qrl6q\") pod \"success-200-isvc-a225b-predictor-b8745d9cc-cqs9k\" (UID: \"0172b66d-e446-422b-a91f-901dcaad88d2\") " pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" Mar 18 16:53:44.240143 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.240108 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb"] Mar 18 16:53:44.243894 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:53:44.243858 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-dee4fe45dff776b582136d1032f1aec454e4eba50da1ff46bede883a392d15ec WatchSource:0}: Error finding container dee4fe45dff776b582136d1032f1aec454e4eba50da1ff46bede883a392d15ec: Status 404 returned error can't find the container with id dee4fe45dff776b582136d1032f1aec454e4eba50da1ff46bede883a392d15ec Mar 18 16:53:44.275429 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.275402 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" Mar 18 16:53:44.309649 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.309601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" event={"ID":"e9d24bd6-a499-475b-81ac-535409aeb9bf","Type":"ContainerStarted","Data":"dee4fe45dff776b582136d1032f1aec454e4eba50da1ff46bede883a392d15ec"} Mar 18 16:53:44.370037 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.369960 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg"] Mar 18 16:53:44.375910 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.375886 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:53:44.389728 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.389662 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg"] Mar 18 16:53:44.438860 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.438781 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k"] Mar 18 16:53:44.442874 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:53:44.442844 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0172b66d_e446_422b_a91f_901dcaad88d2.slice/crio-c704a2bc1300cc67826770881057a2765a0bd83631a52f83293180fbade7734f WatchSource:0}: Error finding container c704a2bc1300cc67826770881057a2765a0bd83631a52f83293180fbade7734f: Status 404 returned error can't find the container with id c704a2bc1300cc67826770881057a2765a0bd83631a52f83293180fbade7734f Mar 18 16:53:44.506949 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.506915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg\" (UID: \"77931099-0e41-4ac7-81e1-41e67c4b0c9b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:53:44.507151 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.506986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtcnw\" (UniqueName: \"kubernetes.io/projected/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kube-api-access-qtcnw\") pod \"isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg\" (UID: \"77931099-0e41-4ac7-81e1-41e67c4b0c9b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:53:44.608060 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.608028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg\" (UID: \"77931099-0e41-4ac7-81e1-41e67c4b0c9b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:53:44.608244 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.608078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtcnw\" (UniqueName: \"kubernetes.io/projected/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kube-api-access-qtcnw\") pod \"isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg\" (UID: \"77931099-0e41-4ac7-81e1-41e67c4b0c9b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:53:44.608485 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.608459 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg\" (UID: \"77931099-0e41-4ac7-81e1-41e67c4b0c9b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:53:44.615512 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.615488 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtcnw\" (UniqueName: \"kubernetes.io/projected/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kube-api-access-qtcnw\") pod \"isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg\" (UID: \"77931099-0e41-4ac7-81e1-41e67c4b0c9b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:53:44.697622 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.697507 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:53:44.823654 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:44.823629 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg"] Mar 18 16:53:44.826233 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:53:44.826202 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-cf44396f1f3a20dc546f6ec93b6a482515f8bebf0c2bd7016daf19bf0ffe78f9 WatchSource:0}: Error finding container cf44396f1f3a20dc546f6ec93b6a482515f8bebf0c2bd7016daf19bf0ffe78f9: Status 404 returned error can't find the container with id cf44396f1f3a20dc546f6ec93b6a482515f8bebf0c2bd7016daf19bf0ffe78f9 Mar 18 16:53:45.316693 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:45.316650 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" event={"ID":"0172b66d-e446-422b-a91f-901dcaad88d2","Type":"ContainerStarted","Data":"c704a2bc1300cc67826770881057a2765a0bd83631a52f83293180fbade7734f"} Mar 18 16:53:45.319272 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:45.319241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" event={"ID":"77931099-0e41-4ac7-81e1-41e67c4b0c9b","Type":"ContainerStarted","Data":"cf44396f1f3a20dc546f6ec93b6a482515f8bebf0c2bd7016daf19bf0ffe78f9"} Mar 18 16:53:48.890793 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:48.890684 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:53:48.891213 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:48.890910 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:53:48.892133 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:53:48.892095 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:53:51.344217 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:51.344173 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" event={"ID":"0172b66d-e446-422b-a91f-901dcaad88d2","Type":"ContainerStarted","Data":"35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b"} Mar 18 16:53:51.344844 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:51.344821 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" Mar 18 16:53:51.345846 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:53:51.345795 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:53:51.346267 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:51.346240 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" event={"ID":"77931099-0e41-4ac7-81e1-41e67c4b0c9b","Type":"ContainerStarted","Data":"b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4"} Mar 18 16:53:51.347794 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:51.347771 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" event={"ID":"e9d24bd6-a499-475b-81ac-535409aeb9bf","Type":"ContainerStarted","Data":"d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3"} Mar 18 16:53:51.360326 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:51.360257 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" podStartSLOduration=2.141927368 podStartE2EDuration="8.360236469s" podCreationTimestamp="2026-03-18 16:53:43 +0000 UTC" firstStartedPulling="2026-03-18 16:53:44.44487989 +0000 UTC m=+578.456988693" lastFinishedPulling="2026-03-18 16:53:50.663188981 +0000 UTC m=+584.675297794" observedRunningTime="2026-03-18 16:53:51.358828422 +0000 UTC m=+585.370937248" watchObservedRunningTime="2026-03-18 16:53:51.360236469 +0000 UTC m=+585.372345298" Mar 18 16:53:52.351305 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:52.351262 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:53:53.354972 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:53.354937 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:53:54.359647 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:54.359613 2578 generic.go:358] "Generic (PLEG): container finished" podID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerID="b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4" exitCode=0 Mar 18 16:53:54.360032 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:54.359687 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" event={"ID":"77931099-0e41-4ac7-81e1-41e67c4b0c9b","Type":"ContainerDied","Data":"b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4"} Mar 18 16:53:55.365114 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:55.365070 2578 generic.go:358] "Generic (PLEG): container finished" podID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerID="d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3" exitCode=0 Mar 18 16:53:55.365571 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:55.365144 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" event={"ID":"e9d24bd6-a499-475b-81ac-535409aeb9bf","Type":"ContainerDied","Data":"d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3"} Mar 18 16:53:58.627389 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:53:58.627351 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fd97696f7-fffnn"] Mar 18 16:54:01.392078 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:01.392042 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" event={"ID":"e9d24bd6-a499-475b-81ac-535409aeb9bf","Type":"ContainerStarted","Data":"166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84"} Mar 18 16:54:01.392563 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:01.392332 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:54:01.393796 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:01.393775 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" event={"ID":"77931099-0e41-4ac7-81e1-41e67c4b0c9b","Type":"ContainerStarted","Data":"c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f"} Mar 18 16:54:01.393901 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:01.393769 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:54:01.394062 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:01.394046 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:54:01.395075 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:01.395051 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:54:01.408568 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:01.408503 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podStartSLOduration=1.866573277 podStartE2EDuration="18.408469498s" podCreationTimestamp="2026-03-18 16:53:43 +0000 UTC" firstStartedPulling="2026-03-18 16:53:44.246185503 +0000 UTC m=+578.258294312" lastFinishedPulling="2026-03-18 16:54:00.788081727 +0000 UTC m=+594.800190533" observedRunningTime="2026-03-18 16:54:01.407160154 +0000 UTC m=+595.419269004" watchObservedRunningTime="2026-03-18 16:54:01.408469498 +0000 UTC m=+595.420578324" Mar 18 16:54:01.422210 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:01.422165 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podStartSLOduration=1.4772973710000001 podStartE2EDuration="17.422152161s" podCreationTimestamp="2026-03-18 16:53:44 +0000 UTC" firstStartedPulling="2026-03-18 16:53:44.828142588 +0000 UTC m=+578.840251391" lastFinishedPulling="2026-03-18 16:54:00.772997375 +0000 UTC m=+594.785106181" 
observedRunningTime="2026-03-18 16:54:01.421285603 +0000 UTC m=+595.433394425" watchObservedRunningTime="2026-03-18 16:54:01.422152161 +0000 UTC m=+595.434260985" Mar 18 16:54:02.397659 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:02.397621 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:54:02.397659 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:02.397634 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:54:03.355507 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:03.355461 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:54:03.589403 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:54:03.589366 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:54:12.398452 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:12.398402 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:54:12.398452 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:12.398403 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:54:13.355475 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:13.355426 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:54:17.589812 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:54:17.589762 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:54:22.398336 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:22.398293 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:54:22.410638 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:22.398301 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:54:23.355441 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.355390 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:54:23.653292 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.653205 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fd97696f7-fffnn" podUID="ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" containerName="console" containerID="cri-o://4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886" gracePeriod=15 Mar 18 16:54:23.891570 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.891543 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fd97696f7-fffnn_ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15/console/0.log" Mar 18 16:54:23.891699 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.891611 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:54:23.959246 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959166 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-trusted-ca-bundle\") pod \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " Mar 18 16:54:23.959246 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959204 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkfq4\" (UniqueName: \"kubernetes.io/projected/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-kube-api-access-mkfq4\") pod \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " Mar 18 16:54:23.959246 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959231 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-config\") pod \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " Mar 18 16:54:23.959482 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959259 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-service-ca\") pod \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " Mar 18 16:54:23.959482 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959294 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-serving-cert\") pod \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " Mar 18 16:54:23.959482 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959352 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-oauth-serving-cert\") pod \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " Mar 18 16:54:23.959482 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959376 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-oauth-config\") pod \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\" (UID: \"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15\") " Mar 18 16:54:23.959700 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959629 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" (UID: "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:54:23.959752 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959702 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-config" (OuterVolumeSpecName: "console-config") pod "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" (UID: "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:54:23.959807 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959768 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-service-ca" (OuterVolumeSpecName: "service-ca") pod "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" (UID: "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:54:23.959962 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.959941 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" (UID: "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:54:23.961431 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.961404 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" (UID: "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:54:23.961618 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.961467 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-kube-api-access-mkfq4" (OuterVolumeSpecName: "kube-api-access-mkfq4") pod "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" (UID: "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15"). InnerVolumeSpecName "kube-api-access-mkfq4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:54:23.961618 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:23.961466 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" (UID: "ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:54:24.060870 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.060832 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-trusted-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:54:24.060870 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.060862 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mkfq4\" (UniqueName: \"kubernetes.io/projected/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-kube-api-access-mkfq4\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:54:24.060870 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.060874 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-config\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:54:24.061145 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.060883 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-service-ca\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:54:24.061145 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.060892 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-serving-cert\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:54:24.061145 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.060900 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-oauth-serving-cert\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:54:24.061145 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.060911 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15-console-oauth-config\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:54:24.478184 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.478153 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fd97696f7-fffnn_ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15/console/0.log" Mar 18 16:54:24.478371 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.478197 2578 generic.go:358] "Generic (PLEG): container finished" podID="ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" containerID="4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886" exitCode=2 Mar 18 16:54:24.478371 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.478290 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fd97696f7-fffnn" Mar 18 16:54:24.478371 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.478289 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd97696f7-fffnn" event={"ID":"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15","Type":"ContainerDied","Data":"4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886"} Mar 18 16:54:24.478371 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.478331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd97696f7-fffnn" event={"ID":"ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15","Type":"ContainerDied","Data":"a8eb5053bcecb7ffa239e572c464665ded5fbebb250b9ee84feb86e3fcb886cd"} Mar 18 16:54:24.478371 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.478352 2578 scope.go:117] "RemoveContainer" containerID="4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886" Mar 18 16:54:24.487610 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.487590 2578 scope.go:117] "RemoveContainer" containerID="4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886" Mar 18 16:54:24.487873 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:54:24.487853 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886\": container with ID starting with 4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886 not found: ID does not exist" containerID="4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886" Mar 18 16:54:24.487937 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.487886 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886"} err="failed to get container status \"4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886\": rpc error: code = NotFound desc = could not find container \"4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886\": container with ID starting with 4d6029c3b15fd7c1371a21e23d399415587beab7d7f8dbb7d391200d5fd78886 not found: ID does not exist" Mar 18 16:54:24.498958 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.498935 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fd97696f7-fffnn"] Mar 18 16:54:24.501896 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.501875 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5fd97696f7-fffnn"] Mar 18 16:54:24.593115 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:24.593083 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" path="/var/lib/kubelet/pods/ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15/volumes" Mar 18 16:54:29.849467 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:54:29.849371 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" 
Mar 18 16:54:29.849877 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:54:29.849586 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 16:54:29.850780 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:54:29.850752 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 16:54:32.397779 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:32.397732 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Mar 18 16:54:32.398177 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:32.397742 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Mar 18 16:54:33.355393 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:33.355350 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Mar 18 16:54:42.398610 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:42.398568 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Mar 18 16:54:42.399028 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:42.398570 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Mar 18 16:54:42.589709 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:54:42.589677 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 16:54:43.356012 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:43.355974 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Mar 18 16:54:52.397724 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:52.397682 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Mar 18 16:54:52.398189 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:52.397679 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Mar 18 16:54:53.356266 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:53.356226 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k"
Mar 18 16:54:57.589197 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:54:57.589068 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:54:57.589407 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:54:57.589267 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 16:55:02.397839 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:02.397792 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Mar 18 16:55:02.398221 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:02.397792 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Mar 18 16:55:07.588788 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:07.588745 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Mar 18 16:55:08.588972 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:08.588931 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Mar 18 16:55:11.589321 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:55:11.589280 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 16:55:13.658830 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.658793 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"]
Mar 18 16:55:13.659408 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.659297 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" containerName="console"
Mar 18 16:55:13.659408 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.659318 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" containerName="console"
Mar 18 16:55:13.659408 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.659394 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ade3e0f6-9c50-4ce3-ab0c-587c10f7ba15" containerName="console"
Mar 18 16:55:13.663453 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.663431 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:13.665566 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.665517 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-a225b-serving-cert\""
Mar 18 16:55:13.665688 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.665570 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-a225b-kube-rbac-proxy-sar-config\""
Mar 18 16:55:13.673516 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.673494 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"]
Mar 18 16:55:13.793976 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.793946 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/717f798e-7a76-498a-a553-4bf29eec66b0-proxy-tls\") pod \"switch-graph-a225b-c4ddc8b65-wjsvf\" (UID: \"717f798e-7a76-498a-a553-4bf29eec66b0\") " pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:13.794159 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.794005 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/717f798e-7a76-498a-a553-4bf29eec66b0-openshift-service-ca-bundle\") pod \"switch-graph-a225b-c4ddc8b65-wjsvf\" (UID: \"717f798e-7a76-498a-a553-4bf29eec66b0\") " pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:13.895396 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.895352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/717f798e-7a76-498a-a553-4bf29eec66b0-proxy-tls\") pod \"switch-graph-a225b-c4ddc8b65-wjsvf\" (UID: \"717f798e-7a76-498a-a553-4bf29eec66b0\") " pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:13.895601 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.895417 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/717f798e-7a76-498a-a553-4bf29eec66b0-openshift-service-ca-bundle\") pod \"switch-graph-a225b-c4ddc8b65-wjsvf\" (UID: \"717f798e-7a76-498a-a553-4bf29eec66b0\") " pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:13.896090 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.896063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/717f798e-7a76-498a-a553-4bf29eec66b0-openshift-service-ca-bundle\") pod \"switch-graph-a225b-c4ddc8b65-wjsvf\" (UID: \"717f798e-7a76-498a-a553-4bf29eec66b0\") " pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:13.897840 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.897813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/717f798e-7a76-498a-a553-4bf29eec66b0-proxy-tls\") pod \"switch-graph-a225b-c4ddc8b65-wjsvf\" (UID: \"717f798e-7a76-498a-a553-4bf29eec66b0\") " pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:13.974110 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:13.974028 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:14.098227 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:14.098102 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"]
Mar 18 16:55:14.100896 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:55:14.100865 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod717f798e_7a76_498a_a553_4bf29eec66b0.slice/crio-f26ca98224890462749201c681be21006a90c9cb1e28999887edc4cbf978a04d WatchSource:0}: Error finding container f26ca98224890462749201c681be21006a90c9cb1e28999887edc4cbf978a04d: Status 404 returned error can't find the container with id f26ca98224890462749201c681be21006a90c9cb1e28999887edc4cbf978a04d
Mar 18 16:55:14.648394 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:14.648353 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" event={"ID":"717f798e-7a76-498a-a553-4bf29eec66b0","Type":"ContainerStarted","Data":"f26ca98224890462749201c681be21006a90c9cb1e28999887edc4cbf978a04d"}
Mar 18 16:55:15.653644 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:15.653607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" event={"ID":"717f798e-7a76-498a-a553-4bf29eec66b0","Type":"ContainerStarted","Data":"b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543"}
Mar 18 16:55:15.654070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:15.653712 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:15.672414 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:15.672355 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" podStartSLOduration=1.190878003 podStartE2EDuration="2.672339011s" podCreationTimestamp="2026-03-18 16:55:13 +0000 UTC" firstStartedPulling="2026-03-18 16:55:14.10277109 +0000 UTC m=+668.114879892" lastFinishedPulling="2026-03-18 16:55:15.584232088 +0000 UTC m=+669.596340900" observedRunningTime="2026-03-18 16:55:15.671965641 +0000 UTC m=+669.684074466" watchObservedRunningTime="2026-03-18 16:55:15.672339011 +0000 UTC m=+669.684447829"
Mar 18 16:55:17.588867 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:17.588824 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Mar 18 16:55:18.589439 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:18.589396 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Mar 18 16:55:21.662713 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:21.662680 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:23.615385 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:23.615346 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"]
Mar 18 16:55:23.615897 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:23.615689 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" containerName="switch-graph-a225b" containerID="cri-o://b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543" gracePeriod=30
Mar 18 16:55:24.065132 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.065095 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k"]
Mar 18 16:55:24.065903 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.065349 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container" containerID="cri-o://35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b" gracePeriod=30
Mar 18 16:55:24.067753 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.067726 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"]
Mar 18 16:55:24.071279 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.071260 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"
Mar 18 16:55:24.080003 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.079978 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"]
Mar 18 16:55:24.183921 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.183885 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwx4\" (UniqueName: \"kubernetes.io/projected/95a83498-405c-443e-b367-47e3e745d28c-kube-api-access-6pwx4\") pod \"success-200-isvc-dfe46-predictor-756f46b767-v7t9k\" (UID: \"95a83498-405c-443e-b367-47e3e745d28c\") " pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"
Mar 18 16:55:24.285125 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.285090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwx4\" (UniqueName: \"kubernetes.io/projected/95a83498-405c-443e-b367-47e3e745d28c-kube-api-access-6pwx4\") pod \"success-200-isvc-dfe46-predictor-756f46b767-v7t9k\" (UID: \"95a83498-405c-443e-b367-47e3e745d28c\") " pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"
Mar 18 16:55:24.293310 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.293279 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwx4\" (UniqueName: \"kubernetes.io/projected/95a83498-405c-443e-b367-47e3e745d28c-kube-api-access-6pwx4\") pod \"success-200-isvc-dfe46-predictor-756f46b767-v7t9k\" (UID: \"95a83498-405c-443e-b367-47e3e745d28c\") " pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"
Mar 18 16:55:24.384891 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.384814 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"
Mar 18 16:55:24.511986 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.511957 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"]
Mar 18 16:55:24.514869 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:55:24.514837 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95a83498_405c_443e_b367_47e3e745d28c.slice/crio-3d9015e22c71619e096b8bc897e6c3cdeedf8acd8ffa0513113f224fef3c1bdc WatchSource:0}: Error finding container 3d9015e22c71619e096b8bc897e6c3cdeedf8acd8ffa0513113f224fef3c1bdc: Status 404 returned error can't find the container with id 3d9015e22c71619e096b8bc897e6c3cdeedf8acd8ffa0513113f224fef3c1bdc
Mar 18 16:55:24.683402 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.683369 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" event={"ID":"95a83498-405c-443e-b367-47e3e745d28c","Type":"ContainerStarted","Data":"77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd"}
Mar 18 16:55:24.683402 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.683409 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" event={"ID":"95a83498-405c-443e-b367-47e3e745d28c","Type":"ContainerStarted","Data":"3d9015e22c71619e096b8bc897e6c3cdeedf8acd8ffa0513113f224fef3c1bdc"}
Mar 18 16:55:24.683801 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.683516 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"
Mar 18 16:55:24.684996 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.684969 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Mar 18 16:55:24.697337 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:24.697299 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" podStartSLOduration=0.697287465 podStartE2EDuration="697.287465ms" podCreationTimestamp="2026-03-18 16:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:55:24.6961171 +0000 UTC m=+678.708225922" watchObservedRunningTime="2026-03-18 16:55:24.697287465 +0000 UTC m=+678.709396284"
Mar 18 16:55:25.686936 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:25.686898 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Mar 18 16:55:26.594222 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:55:26.594193 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 16:55:26.661452 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:26.661413 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" containerName="switch-graph-a225b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:55:27.316312 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.316289 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k"
Mar 18 16:55:27.412034 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.411937 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrl6q\" (UniqueName: \"kubernetes.io/projected/0172b66d-e446-422b-a91f-901dcaad88d2-kube-api-access-qrl6q\") pod \"0172b66d-e446-422b-a91f-901dcaad88d2\" (UID: \"0172b66d-e446-422b-a91f-901dcaad88d2\") "
Mar 18 16:55:27.414193 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.414162 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0172b66d-e446-422b-a91f-901dcaad88d2-kube-api-access-qrl6q" (OuterVolumeSpecName: "kube-api-access-qrl6q") pod "0172b66d-e446-422b-a91f-901dcaad88d2" (UID: "0172b66d-e446-422b-a91f-901dcaad88d2"). InnerVolumeSpecName "kube-api-access-qrl6q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:55:27.512709 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.512673 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrl6q\" (UniqueName: \"kubernetes.io/projected/0172b66d-e446-422b-a91f-901dcaad88d2-kube-api-access-qrl6q\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:55:27.590223 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.590194 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb"
Mar 18 16:55:27.694544 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.694442 2578 generic.go:358] "Generic (PLEG): container finished" podID="0172b66d-e446-422b-a91f-901dcaad88d2" containerID="35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b" exitCode=0
Mar 18 16:55:27.694544 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.694506 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k"
Mar 18 16:55:27.694726 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.694538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" event={"ID":"0172b66d-e446-422b-a91f-901dcaad88d2","Type":"ContainerDied","Data":"35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b"}
Mar 18 16:55:27.694726 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.694590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k" event={"ID":"0172b66d-e446-422b-a91f-901dcaad88d2","Type":"ContainerDied","Data":"c704a2bc1300cc67826770881057a2765a0bd83631a52f83293180fbade7734f"}
Mar 18 16:55:27.694726 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.694613 2578 scope.go:117] "RemoveContainer" containerID="35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b"
Mar 18 16:55:27.704266 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.704242 2578 scope.go:117] "RemoveContainer" containerID="35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b"
Mar 18 16:55:27.704546 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:55:27.704513 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b\": container with ID starting with 35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b not found: ID does not exist" containerID="35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b"
Mar 18 16:55:27.704613 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.704556 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b"} err="failed to get container status \"35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b\": rpc error: code = NotFound desc = could not find container \"35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b\": container with ID starting with 35dc1883e9fbfd14410f51f917a42991e0d8eaf0c4800178788d4553c717073b not found: ID does not exist"
Mar 18 16:55:27.714368 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.714344 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k"]
Mar 18 16:55:27.718236 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:27.718214 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a225b-predictor-b8745d9cc-cqs9k"]
Mar 18 16:55:28.592634 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:28.592604 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" path="/var/lib/kubelet/pods/0172b66d-e446-422b-a91f-901dcaad88d2/volumes"
Mar 18 16:55:28.592998 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:28.592900 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg"
Mar 18 16:55:31.661341 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:31.661304 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" containerName="switch-graph-a225b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:55:35.686902 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:35.686859 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Mar 18 16:55:36.664325 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:36.664280 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" containerName="switch-graph-a225b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:55:36.664497 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:36.664433 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:39.589689 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:55:39.589662 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 16:55:41.661285 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:41.661248 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" containerName="switch-graph-a225b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:55:45.687081 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:45.687039 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Mar 18 16:55:46.661103 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:46.661066 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" containerName="switch-graph-a225b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:55:50.913802 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:55:50.913704 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast"
Mar 18 16:55:50.914187 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:55:50.913906 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 16:55:50.915105 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:55:50.915080 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 16:55:51.661198 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:51.661148 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" containerName="switch-graph-a225b" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:55:53.708304 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.708251 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-5894969db4-z496p"]
Mar 18 16:55:53.708792 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.708762 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container"
Mar 18 16:55:53.708792 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.708777 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container"
Mar 18 16:55:53.708913 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.708863 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0172b66d-e446-422b-a91f-901dcaad88d2" containerName="kserve-container"
Mar 18 16:55:53.712832 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.712807 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p"
Mar 18 16:55:53.715083 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.715057 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\""
Mar 18 16:55:53.715343 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.715321 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\""
Mar 18 16:55:53.720154 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.720131 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5894969db4-z496p"]
Mar 18 16:55:53.736215 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.736187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee702450-de44-4b59-810b-1b89e630f1ab-proxy-tls\") pod \"model-chainer-5894969db4-z496p\" (UID: \"ee702450-de44-4b59-810b-1b89e630f1ab\") " pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p"
Mar 18 16:55:53.736380 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.736232 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee702450-de44-4b59-810b-1b89e630f1ab-openshift-service-ca-bundle\") pod \"model-chainer-5894969db4-z496p\" (UID: \"ee702450-de44-4b59-810b-1b89e630f1ab\") " pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p"
Mar 18 16:55:53.768093 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.768069 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:53.780375 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.780350 2578 generic.go:358] "Generic (PLEG): container finished" podID="717f798e-7a76-498a-a553-4bf29eec66b0" containerID="b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543" exitCode=0
Mar 18 16:55:53.780495 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.780415 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"
Mar 18 16:55:53.780495 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.780415 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" event={"ID":"717f798e-7a76-498a-a553-4bf29eec66b0","Type":"ContainerDied","Data":"b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543"}
Mar 18 16:55:53.780495 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.780457 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf" event={"ID":"717f798e-7a76-498a-a553-4bf29eec66b0","Type":"ContainerDied","Data":"f26ca98224890462749201c681be21006a90c9cb1e28999887edc4cbf978a04d"}
Mar 18 16:55:53.780495 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.780479 2578 scope.go:117] "RemoveContainer" containerID="b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543"
Mar 18 16:55:53.790761 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.790740 2578 scope.go:117] "RemoveContainer" containerID="b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543"
Mar 18 16:55:53.791051 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:55:53.791031 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543\": container with ID starting with b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543 not found: ID does not exist" containerID="b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543"
Mar 18 16:55:53.791101 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.791062 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543"} err="failed to get container status \"b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543\": rpc error: code = NotFound desc = could not find container \"b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543\": container with ID starting with b1d7178b2f2fc99f70c5b1f85a5f70b3d97b8cbccab5f0796830660fc98b3543 not found: ID does not exist"
Mar 18 16:55:53.837180 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.837151 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/717f798e-7a76-498a-a553-4bf29eec66b0-proxy-tls\") pod \"717f798e-7a76-498a-a553-4bf29eec66b0\" (UID: \"717f798e-7a76-498a-a553-4bf29eec66b0\") "
Mar 18 16:55:53.837358 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.837246 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/717f798e-7a76-498a-a553-4bf29eec66b0-openshift-service-ca-bundle\") pod \"717f798e-7a76-498a-a553-4bf29eec66b0\" (UID: \"717f798e-7a76-498a-a553-4bf29eec66b0\") "
Mar 18 16:55:53.837426 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.837384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee702450-de44-4b59-810b-1b89e630f1ab-proxy-tls\") pod \"model-chainer-5894969db4-z496p\" (UID: \"ee702450-de44-4b59-810b-1b89e630f1ab\") " pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p"
Mar 18 16:55:53.837481 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.837423 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee702450-de44-4b59-810b-1b89e630f1ab-openshift-service-ca-bundle\") pod \"model-chainer-5894969db4-z496p\" (UID: \"ee702450-de44-4b59-810b-1b89e630f1ab\") " pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p"
Mar 18 16:55:53.837767 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.837668 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717f798e-7a76-498a-a553-4bf29eec66b0-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "717f798e-7a76-498a-a553-4bf29eec66b0" (UID: "717f798e-7a76-498a-a553-4bf29eec66b0"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:55:53.838185 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.838155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee702450-de44-4b59-810b-1b89e630f1ab-openshift-service-ca-bundle\") pod \"model-chainer-5894969db4-z496p\" (UID: \"ee702450-de44-4b59-810b-1b89e630f1ab\") " pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p"
Mar 18 16:55:53.839695 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.839661 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717f798e-7a76-498a-a553-4bf29eec66b0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "717f798e-7a76-498a-a553-4bf29eec66b0" (UID: "717f798e-7a76-498a-a553-4bf29eec66b0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:55:53.839853 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.839835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee702450-de44-4b59-810b-1b89e630f1ab-proxy-tls\") pod \"model-chainer-5894969db4-z496p\" (UID: \"ee702450-de44-4b59-810b-1b89e630f1ab\") " pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p"
Mar 18 16:55:53.938169 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.938077 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/717f798e-7a76-498a-a553-4bf29eec66b0-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:55:53.938169 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:53.938120 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/717f798e-7a76-498a-a553-4bf29eec66b0-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 16:55:54.028117 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:54.028089 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p"
Mar 18 16:55:54.104702 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:54.104672 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"]
Mar 18 16:55:54.112267 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:54.112238 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a225b-c4ddc8b65-wjsvf"]
Mar 18 16:55:54.150696 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:54.150665 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5894969db4-z496p"]
Mar 18 16:55:54.154006 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:55:54.153980 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee702450_de44_4b59_810b_1b89e630f1ab.slice/crio-c24670b4ca7b9cf0506404370db1fb39fa0ceeee1869e925766efa92c5324ffa WatchSource:0}: Error finding container c24670b4ca7b9cf0506404370db1fb39fa0ceeee1869e925766efa92c5324ffa: Status 404 returned error can't find the container with id c24670b4ca7b9cf0506404370db1fb39fa0ceeee1869e925766efa92c5324ffa
Mar 18 16:55:54.592628 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:54.592598 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" path="/var/lib/kubelet/pods/717f798e-7a76-498a-a553-4bf29eec66b0/volumes"
Mar 18 16:55:54.785423 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:54.785388 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" event={"ID":"ee702450-de44-4b59-810b-1b89e630f1ab","Type":"ContainerStarted","Data":"00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39"}
Mar 18 16:55:54.785423 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:54.785424 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" event={"ID":"ee702450-de44-4b59-810b-1b89e630f1ab","Type":"ContainerStarted","Data":"c24670b4ca7b9cf0506404370db1fb39fa0ceeee1869e925766efa92c5324ffa"}
Mar 18 16:55:54.785842 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:54.785473 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p"
Mar 18 16:55:54.800581 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:54.800513 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" podStartSLOduration=1.8005018229999998 podStartE2EDuration="1.800501823s" podCreationTimestamp="2026-03-18 16:55:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:55:54.799426245 +0000 UTC m=+708.811535070" watchObservedRunningTime="2026-03-18 16:55:54.800501823 +0000 UTC m=+708.812610651"
Mar 18 16:55:55.687323 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:55:55.687227 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Mar 18 16:56:00.797250 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:00.797215 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p"
Mar 18 16:56:03.589144 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:03.589109 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 16:56:03.790085 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:03.790043 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5894969db4-z496p"]
Mar 18 16:56:03.790355 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:03.790307 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" podUID="ee702450-de44-4b59-810b-1b89e630f1ab" containerName="model-chainer" containerID="cri-o://00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39" gracePeriod=30
Mar 18 16:56:04.080429 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.080392 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg"]
Mar 18 16:56:04.080720 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.080697 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" containerID="cri-o://c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f" gracePeriod=30
Mar 18 16:56:04.094898 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.094868 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m"]
Mar 18 16:56:04.095261 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.095246 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" containerName="switch-graph-a225b"
Mar 18 16:56:04.095325 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.095262 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" containerName="switch-graph-a225b"
Mar 18 16:56:04.095380 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.095328 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="717f798e-7a76-498a-a553-4bf29eec66b0" containerName="switch-graph-a225b"
Mar 18 16:56:04.098361 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.098343 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m"
Mar 18 16:56:04.103558 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.103518 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m"]
Mar 18 16:56:04.229985 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.229951 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzts6\" (UniqueName: \"kubernetes.io/projected/e10de7c8-98d3-46a1-87ee-53839a451906-kube-api-access-dzts6\") pod \"success-200-isvc-9913d-predictor-6b8667cb5c-rct6m\" (UID: \"e10de7c8-98d3-46a1-87ee-53839a451906\") " pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m"
Mar 18 16:56:04.274724 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.274691 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb"]
Mar 18 16:56:04.275260 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.275226 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" containerID="cri-o://166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84" gracePeriod=30
Mar 18 16:56:04.331028 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.330937 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzts6\" (UniqueName: \"kubernetes.io/projected/e10de7c8-98d3-46a1-87ee-53839a451906-kube-api-access-dzts6\") pod \"success-200-isvc-9913d-predictor-6b8667cb5c-rct6m\" (UID: \"e10de7c8-98d3-46a1-87ee-53839a451906\") " pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m"
Mar 18 16:56:04.344813 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.344777 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzts6\" (UniqueName: \"kubernetes.io/projected/e10de7c8-98d3-46a1-87ee-53839a451906-kube-api-access-dzts6\") pod \"success-200-isvc-9913d-predictor-6b8667cb5c-rct6m\" (UID: \"e10de7c8-98d3-46a1-87ee-53839a451906\") " pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m"
Mar 18 16:56:04.409880 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.409841 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" Mar 18 16:56:04.538667 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.538631 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m"] Mar 18 16:56:04.542679 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:56:04.542640 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode10de7c8_98d3_46a1_87ee_53839a451906.slice/crio-a780d50446591e3aaba265c2e4e9586531d271719c791328d1b43513089b9b5a WatchSource:0}: Error finding container a780d50446591e3aaba265c2e4e9586531d271719c791328d1b43513089b9b5a: Status 404 returned error can't find the container with id a780d50446591e3aaba265c2e4e9586531d271719c791328d1b43513089b9b5a Mar 18 16:56:04.821209 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.821178 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" event={"ID":"e10de7c8-98d3-46a1-87ee-53839a451906","Type":"ContainerStarted","Data":"12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988"} Mar 18 16:56:04.821209 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.821213 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" event={"ID":"e10de7c8-98d3-46a1-87ee-53839a451906","Type":"ContainerStarted","Data":"a780d50446591e3aaba265c2e4e9586531d271719c791328d1b43513089b9b5a"} Mar 18 16:56:04.821657 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.821437 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" Mar 18 16:56:04.822869 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.822844 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Mar 18 16:56:04.840023 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:04.839908 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" podStartSLOduration=0.839894394 podStartE2EDuration="839.894394ms" podCreationTimestamp="2026-03-18 16:56:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:56:04.836519803 +0000 UTC m=+718.848628628" watchObservedRunningTime="2026-03-18 16:56:04.839894394 +0000 UTC m=+718.852003219" Mar 18 16:56:05.687810 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:05.687760 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Mar 18 16:56:05.795503 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:05.795457 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" podUID="ee702450-de44-4b59-810b-1b89e630f1ab" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 
18 16:56:05.825345 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:05.825309 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Mar 18 16:56:07.588844 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:07.588796 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:56:08.589786 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:08.589736 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:56:09.628588 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.628562 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:56:09.678232 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.678196 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtcnw\" (UniqueName: \"kubernetes.io/projected/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kube-api-access-qtcnw\") pod \"77931099-0e41-4ac7-81e1-41e67c4b0c9b\" (UID: \"77931099-0e41-4ac7-81e1-41e67c4b0c9b\") " Mar 18 16:56:09.678392 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.678243 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kserve-provision-location\") pod \"77931099-0e41-4ac7-81e1-41e67c4b0c9b\" (UID: \"77931099-0e41-4ac7-81e1-41e67c4b0c9b\") " Mar 18 16:56:09.678611 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.678584 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "77931099-0e41-4ac7-81e1-41e67c4b0c9b" (UID: "77931099-0e41-4ac7-81e1-41e67c4b0c9b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:56:09.680459 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.680425 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kube-api-access-qtcnw" (OuterVolumeSpecName: "kube-api-access-qtcnw") pod "77931099-0e41-4ac7-81e1-41e67c4b0c9b" (UID: "77931099-0e41-4ac7-81e1-41e67c4b0c9b"). InnerVolumeSpecName "kube-api-access-qtcnw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:56:09.779087 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.779052 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtcnw\" (UniqueName: \"kubernetes.io/projected/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kube-api-access-qtcnw\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:56:09.779087 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.779080 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77931099-0e41-4ac7-81e1-41e67c4b0c9b-kserve-provision-location\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:56:09.806093 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.806071 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:56:09.840248 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.840218 2578 generic.go:358] "Generic (PLEG): container finished" podID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerID="c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f" exitCode=0 Mar 18 16:56:09.840407 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.840359 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" Mar 18 16:56:09.840407 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.840360 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" event={"ID":"77931099-0e41-4ac7-81e1-41e67c4b0c9b","Type":"ContainerDied","Data":"c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f"} Mar 18 16:56:09.840407 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.840405 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg" event={"ID":"77931099-0e41-4ac7-81e1-41e67c4b0c9b","Type":"ContainerDied","Data":"cf44396f1f3a20dc546f6ec93b6a482515f8bebf0c2bd7016daf19bf0ffe78f9"} Mar 18 16:56:09.840610 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.840426 2578 scope.go:117] "RemoveContainer" containerID="c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f" Mar 18 16:56:09.842423 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.842357 2578 generic.go:358] "Generic (PLEG): container finished" podID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerID="166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84" exitCode=0 Mar 18 16:56:09.842593 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.842430 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" Mar 18 16:56:09.842593 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.842431 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" event={"ID":"e9d24bd6-a499-475b-81ac-535409aeb9bf","Type":"ContainerDied","Data":"166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84"} Mar 18 16:56:09.842593 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.842580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb" event={"ID":"e9d24bd6-a499-475b-81ac-535409aeb9bf","Type":"ContainerDied","Data":"dee4fe45dff776b582136d1032f1aec454e4eba50da1ff46bede883a392d15ec"} Mar 18 16:56:09.850665 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.850645 2578 scope.go:117] "RemoveContainer" containerID="b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4" Mar 18 16:56:09.859401 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.859374 2578 scope.go:117] "RemoveContainer" containerID="c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f" Mar 18 16:56:09.859717 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:09.859698 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f\": container with ID starting with c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f not found: ID does not exist" containerID="c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f" Mar 18 16:56:09.859795 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.859724 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f"} err="failed to get container status \"c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f\": rpc error: code = NotFound desc = could not find container \"c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f\": container with ID starting with c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f not found: ID does not exist" Mar 18 16:56:09.859795 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.859741 2578 scope.go:117] "RemoveContainer" containerID="b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4" Mar 18 16:56:09.859970 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:09.859951 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4\": container with ID starting with b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4 not found: ID does not exist" containerID="b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4" Mar 18 16:56:09.860031 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.859981 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4"} err="failed to get container status \"b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4\": rpc error: code = NotFound desc = could not find container \"b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4\": container with ID starting with 
b828013c758597f2deef8b4f4f498ea02ad00127ed903653a430c3aae6ff8ad4 not found: ID does not exist" Mar 18 16:56:09.860031 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.860005 2578 scope.go:117] "RemoveContainer" containerID="166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84" Mar 18 16:56:09.864988 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.864968 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg"] Mar 18 16:56:09.867612 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.867591 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-f89879bcc-6z6lg"] Mar 18 16:56:09.867926 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.867910 2578 scope.go:117] "RemoveContainer" containerID="d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3" Mar 18 16:56:09.876347 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.876328 2578 scope.go:117] "RemoveContainer" containerID="166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84" Mar 18 16:56:09.876636 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:09.876617 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84\": container with ID starting with 166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84 not found: ID does not exist" containerID="166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84" Mar 18 16:56:09.876685 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.876643 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84"} err="failed to get container status \"166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84\": rpc error: code = NotFound desc = could not find container \"166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84\": container with ID starting with 166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84 not found: ID does not exist" Mar 18 16:56:09.876685 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.876661 2578 scope.go:117] "RemoveContainer" containerID="d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3" Mar 18 16:56:09.876894 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:09.876876 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3\": container with ID starting with d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3 not found: ID does not exist" containerID="d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3" Mar 18 16:56:09.876948 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.876902 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3"} err="failed to get container status \"d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3\": rpc error: code = NotFound desc = could not find container \"d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3\": container with ID starting with d7c4df6247b6738d260e62ebd95a02210b87f2cc43d657aa86580194e07a7fa3 not found: ID does not exist" Mar 18 16:56:09.880233 ip-10-0-143-175 
kubenswrapper[2578]: I0318 16:56:09.880216 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9d24bd6-a499-475b-81ac-535409aeb9bf-kserve-provision-location\") pod \"e9d24bd6-a499-475b-81ac-535409aeb9bf\" (UID: \"e9d24bd6-a499-475b-81ac-535409aeb9bf\") " Mar 18 16:56:09.880296 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.880286 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn6ns\" (UniqueName: \"kubernetes.io/projected/e9d24bd6-a499-475b-81ac-535409aeb9bf-kube-api-access-vn6ns\") pod \"e9d24bd6-a499-475b-81ac-535409aeb9bf\" (UID: \"e9d24bd6-a499-475b-81ac-535409aeb9bf\") " Mar 18 16:56:09.880533 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.880496 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9d24bd6-a499-475b-81ac-535409aeb9bf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e9d24bd6-a499-475b-81ac-535409aeb9bf" (UID: "e9d24bd6-a499-475b-81ac-535409aeb9bf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:56:09.882125 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.882107 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d24bd6-a499-475b-81ac-535409aeb9bf-kube-api-access-vn6ns" (OuterVolumeSpecName: "kube-api-access-vn6ns") pod "e9d24bd6-a499-475b-81ac-535409aeb9bf" (UID: "e9d24bd6-a499-475b-81ac-535409aeb9bf"). InnerVolumeSpecName "kube-api-access-vn6ns". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:56:09.981021 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.980912 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9d24bd6-a499-475b-81ac-535409aeb9bf-kserve-provision-location\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:56:09.981021 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:09.980959 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vn6ns\" (UniqueName: \"kubernetes.io/projected/e9d24bd6-a499-475b-81ac-535409aeb9bf-kube-api-access-vn6ns\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:56:10.163349 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:10.163310 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb"] Mar 18 16:56:10.169311 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:10.169282 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-6579795fb5-vvblb"] Mar 18 16:56:10.593084 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:10.593049 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" path="/var/lib/kubelet/pods/77931099-0e41-4ac7-81e1-41e67c4b0c9b/volumes" Mar 18 16:56:10.593453 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:10.593439 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" path="/var/lib/kubelet/pods/e9d24bd6-a499-475b-81ac-535409aeb9bf/volumes" Mar 18 16:56:10.795509 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:10.795476 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" 
podUID="ee702450-de44-4b59-810b-1b89e630f1ab" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:15.687957 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:15.687909 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Mar 18 16:56:15.795100 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:15.795061 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" podUID="ee702450-de44-4b59-810b-1b89e630f1ab" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:15.795282 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:15.795162 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" Mar 18 16:56:15.825902 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:15.825863 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Mar 18 16:56:16.591653 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:16.591622 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:56:20.794781 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:20.794741 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" podUID="ee702450-de44-4b59-810b-1b89e630f1ab" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:25.688449 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:25.688415 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" Mar 18 16:56:25.794512 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:25.794473 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" podUID="ee702450-de44-4b59-810b-1b89e630f1ab" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:25.825482 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:25.825446 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.36:8080: connect: connection refused" Mar 18 16:56:29.589158 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:29.589119 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:56:30.794618 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:30.794579 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" podUID="ee702450-de44-4b59-810b-1b89e630f1ab" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:33.843736 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:33.843663 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee702450_de44_4b59_810b_1b89e630f1ab.slice/crio-c24670b4ca7b9cf0506404370db1fb39fa0ceeee1869e925766efa92c5324ffa\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-conmon-166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-cf44396f1f3a20dc546f6ec93b6a482515f8bebf0c2bd7016daf19bf0ffe78f9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-dee4fe45dff776b582136d1032f1aec454e4eba50da1ff46bede883a392d15ec\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-conmon-c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee702450_de44_4b59_810b_1b89e630f1ab.slice/crio-00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee702450_de44_4b59_810b_1b89e630f1ab.slice/crio-conmon-00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:56:33.844252 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:33.843742 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee702450_de44_4b59_810b_1b89e630f1ab.slice/crio-00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-conmon-c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-dee4fe45dff776b582136d1032f1aec454e4eba50da1ff46bede883a392d15ec\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-cf44396f1f3a20dc546f6ec93b6a482515f8bebf0c2bd7016daf19bf0ffe78f9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-conmon-166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee702450_de44_4b59_810b_1b89e630f1ab.slice/crio-conmon-00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:56:33.844389 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:33.843863 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-conmon-166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee702450_de44_4b59_810b_1b89e630f1ab.slice/crio-conmon-00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-cf44396f1f3a20dc546f6ec93b6a482515f8bebf0c2bd7016daf19bf0ffe78f9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee702450_de44_4b59_810b_1b89e630f1ab.slice/crio-00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-conmon-c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-dee4fe45dff776b582136d1032f1aec454e4eba50da1ff46bede883a392d15ec\": RecentStats: unable to find data in memory cache]" Mar 18 16:56:33.845786 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:33.843897 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-conmon-166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-cf44396f1f3a20dc546f6ec93b6a482515f8bebf0c2bd7016daf19bf0ffe78f9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77931099_0e41_4ac7_81e1_41e67c4b0c9b.slice/crio-conmon-c3ffa98f1625a3fdd5df40150f7a8c03bbc9ac00954e5368ae7d00ab09f5750f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee702450_de44_4b59_810b_1b89e630f1ab.slice/crio-conmon-00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee702450_de44_4b59_810b_1b89e630f1ab.slice/crio-00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-dee4fe45dff776b582136d1032f1aec454e4eba50da1ff46bede883a392d15ec\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d24bd6_a499_475b_81ac_535409aeb9bf.slice/crio-166a6c14693b238cc551928589c8467d4e046fab1c918ba03da8cbfb3f20cb84.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:56:33.921620 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:33.921582 2578 generic.go:358] "Generic (PLEG): container finished" podID="ee702450-de44-4b59-810b-1b89e630f1ab" containerID="00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39" exitCode=137 Mar 18 16:56:33.921797 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:33.921648 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" event={"ID":"ee702450-de44-4b59-810b-1b89e630f1ab","Type":"ContainerDied","Data":"00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39"} Mar 18 16:56:33.991669 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:33.991642 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" Mar 18 16:56:34.089886 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.089805 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee702450-de44-4b59-810b-1b89e630f1ab-proxy-tls\") pod \"ee702450-de44-4b59-810b-1b89e630f1ab\" (UID: \"ee702450-de44-4b59-810b-1b89e630f1ab\") " Mar 18 16:56:34.089886 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.089871 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee702450-de44-4b59-810b-1b89e630f1ab-openshift-service-ca-bundle\") pod \"ee702450-de44-4b59-810b-1b89e630f1ab\" (UID: \"ee702450-de44-4b59-810b-1b89e630f1ab\") " Mar 18 16:56:34.090228 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.090199 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee702450-de44-4b59-810b-1b89e630f1ab-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "ee702450-de44-4b59-810b-1b89e630f1ab" (UID: "ee702450-de44-4b59-810b-1b89e630f1ab"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:56:34.091940 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.091915 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee702450-de44-4b59-810b-1b89e630f1ab-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ee702450-de44-4b59-810b-1b89e630f1ab" (UID: "ee702450-de44-4b59-810b-1b89e630f1ab"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:56:34.190344 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.190313 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee702450-de44-4b59-810b-1b89e630f1ab-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:56:34.190344 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.190341 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee702450-de44-4b59-810b-1b89e630f1ab-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 16:56:34.926494 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.926420 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" Mar 18 16:56:34.926494 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.926444 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-5894969db4-z496p" event={"ID":"ee702450-de44-4b59-810b-1b89e630f1ab","Type":"ContainerDied","Data":"c24670b4ca7b9cf0506404370db1fb39fa0ceeee1869e925766efa92c5324ffa"} Mar 18 16:56:34.926974 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.926496 2578 scope.go:117] "RemoveContainer" containerID="00c157743796573a6fd39bd8983ef8ebcbbdf5429376c3f12db3972190817f39" Mar 18 16:56:34.940713 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.940686 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5894969db4-z496p"] Mar 18 16:56:34.944673 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:34.944651 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-5894969db4-z496p"] Mar 18 16:56:35.826038 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:35.825997 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Mar 18 16:56:36.593193 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:36.593153 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee702450-de44-4b59-810b-1b89e630f1ab" path="/var/lib/kubelet/pods/ee702450-de44-4b59-810b-1b89e630f1ab/volumes" Mar 18 16:56:41.589280 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:41.589247 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:56:44.042306 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042271 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7"] Mar 18 16:56:44.042774 
ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042756 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="storage-initializer" Mar 18 16:56:44.042827 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042777 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="storage-initializer" Mar 18 16:56:44.042827 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042800 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="storage-initializer" Mar 18 16:56:44.042827 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042809 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="storage-initializer" Mar 18 16:56:44.042827 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042820 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" Mar 18 16:56:44.042947 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042829 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" Mar 18 16:56:44.042947 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042847 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee702450-de44-4b59-810b-1b89e630f1ab" containerName="model-chainer" Mar 18 16:56:44.042947 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042858 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee702450-de44-4b59-810b-1b89e630f1ab" containerName="model-chainer" Mar 18 16:56:44.042947 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042872 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" Mar 18 16:56:44.042947 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042880 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" Mar 18 16:56:44.043085 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042949 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9d24bd6-a499-475b-81ac-535409aeb9bf" containerName="kserve-container" Mar 18 16:56:44.043085 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042962 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee702450-de44-4b59-810b-1b89e630f1ab" containerName="model-chainer" Mar 18 16:56:44.043085 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.042978 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="77931099-0e41-4ac7-81e1-41e67c4b0c9b" containerName="kserve-container" Mar 18 16:56:44.047694 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.047676 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 16:56:44.049833 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.049806 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-dfe46-kube-rbac-proxy-sar-config\"" Mar 18 16:56:44.050008 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.049856 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-dfe46-serving-cert\"" Mar 18 16:56:44.052031 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.052007 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7"] Mar 18 16:56:44.179657 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.179618 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-proxy-tls\") pod \"switch-graph-dfe46-d9c5f577d-l5vh7\" (UID: \"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0\") " pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 16:56:44.179657 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.179658 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-openshift-service-ca-bundle\") pod \"switch-graph-dfe46-d9c5f577d-l5vh7\" (UID: \"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0\") " pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 16:56:44.280070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.280032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-proxy-tls\") pod \"switch-graph-dfe46-d9c5f577d-l5vh7\" (UID: \"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0\") " pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 16:56:44.280070 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.280072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-openshift-service-ca-bundle\") pod \"switch-graph-dfe46-d9c5f577d-l5vh7\" (UID: \"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0\") " pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 16:56:44.280747 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.280719 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-openshift-service-ca-bundle\") pod \"switch-graph-dfe46-d9c5f577d-l5vh7\" (UID: \"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0\") " pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 16:56:44.282392 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.282374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-proxy-tls\") pod \"switch-graph-dfe46-d9c5f577d-l5vh7\" (UID: \"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0\") " pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 16:56:44.359167 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.359093 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 16:56:44.478348 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.478315 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7"] Mar 18 16:56:44.481575 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:56:44.481550 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86ab7c7e_c09f_49e8_9c9c_4ffa6ecc6ab0.slice/crio-c80e83ab6657761cf0a6343fdf3b8e2f2558d4d58f79e3bacf07dfc6a2727051 WatchSource:0}: Error finding container c80e83ab6657761cf0a6343fdf3b8e2f2558d4d58f79e3bacf07dfc6a2727051: Status 404 returned error can't find the container with id c80e83ab6657761cf0a6343fdf3b8e2f2558d4d58f79e3bacf07dfc6a2727051 Mar 18 16:56:44.966374 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.966341 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" event={"ID":"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0","Type":"ContainerStarted","Data":"b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7"} Mar 18 16:56:44.966374 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.966376 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" event={"ID":"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0","Type":"ContainerStarted","Data":"c80e83ab6657761cf0a6343fdf3b8e2f2558d4d58f79e3bacf07dfc6a2727051"} Mar 18 16:56:44.966612 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.966473 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 16:56:44.983569 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:44.983455 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" podStartSLOduration=0.9834386 podStartE2EDuration="983.4386ms" podCreationTimestamp="2026-03-18 16:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:56:44.983120692 +0000 UTC m=+758.995229516" watchObservedRunningTime="2026-03-18 16:56:44.9834386 +0000 UTC m=+758.995547427" Mar 18 16:56:45.826307 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:45.826267 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Mar 18 16:56:50.975891 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:50.975855 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 16:56:52.589356 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:56:52.589310 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build 
image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:56:55.825708 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:56:55.825657 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Mar 18 16:57:05.589564 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:57:05.589471 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:57:05.826730 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:05.826694 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" Mar 18 16:57:20.589134 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:57:20.589098 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:57:23.992082 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:23.992040 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"] Mar 18 16:57:23.996665 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:23.996643 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" Mar 18 16:57:23.998652 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:23.998626 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-9913d-kube-rbac-proxy-sar-config\"" Mar 18 16:57:23.998652 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:23.998638 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-9913d-serving-cert\"" Mar 18 16:57:24.001957 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:24.001935 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"] Mar 18 16:57:24.023708 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:24.023681 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f2220f-605e-47aa-ba58-5f1f72775f0e-openshift-service-ca-bundle\") pod \"sequence-graph-9913d-8999cff78-v8r6t\" (UID: \"16f2220f-605e-47aa-ba58-5f1f72775f0e\") " pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" Mar 18 16:57:24.023861 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:24.023733 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16f2220f-605e-47aa-ba58-5f1f72775f0e-proxy-tls\") pod \"sequence-graph-9913d-8999cff78-v8r6t\" (UID: \"16f2220f-605e-47aa-ba58-5f1f72775f0e\") " pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" Mar 18 16:57:24.125094 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:24.125052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f2220f-605e-47aa-ba58-5f1f72775f0e-openshift-service-ca-bundle\") pod \"sequence-graph-9913d-8999cff78-v8r6t\" (UID: \"16f2220f-605e-47aa-ba58-5f1f72775f0e\") " pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" Mar 18 16:57:24.125273 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:24.125108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16f2220f-605e-47aa-ba58-5f1f72775f0e-proxy-tls\") pod \"sequence-graph-9913d-8999cff78-v8r6t\" (UID: \"16f2220f-605e-47aa-ba58-5f1f72775f0e\") " pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" Mar 18 16:57:24.125273 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:57:24.125237 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-9913d-serving-cert: secret "sequence-graph-9913d-serving-cert" not found Mar 18 16:57:24.125344 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:57:24.125304 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16f2220f-605e-47aa-ba58-5f1f72775f0e-proxy-tls podName:16f2220f-605e-47aa-ba58-5f1f72775f0e nodeName:}" failed. No retries permitted until 2026-03-18 16:57:24.625287508 +0000 UTC m=+798.637396314 (durationBeforeRetry 500ms). 
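The mount failure above is a startup race rather than a fault: the pod mounts the sequence-graph-9913d-serving-cert secret, which on OpenShift is typically minted asynchronously by the service-ca operator, so the first mount attempt can run before the secret exists. The volume manager re-queues the operation with a delay (durationBeforeRetry 500ms here) that grows on repeated failures, and the retry below succeeds once the secret has appeared. A rough sketch of that pattern, where mountProxyTLS is a hypothetical stand-in for the real volume plugin call (the actual logic lives in nestedpendingoperations.go):

package main

import (
	"errors"
	"fmt"
	"time"
)

// mountProxyTLS is a hypothetical stand-in for the volume plugin call; here
// it fails once and then succeeds, like the log (failure at 16:57:24.125,
// success at 16:57:24.632 after the 500ms back-off).
func mountProxyTLS(attempt int) error {
	if attempt == 0 {
		return errors.New(`secret "sequence-graph-9913d-serving-cert" not found`)
	}
	return nil
}

func main() {
	delay := 500 * time.Millisecond // durationBeforeRetry from the log
	for attempt := 0; ; attempt++ {
		if err := mountProxyTLS(attempt); err != nil {
			fmt.Printf("mount failed: %v; retrying in %v\n", err, delay)
			time.Sleep(delay)
			delay *= 2 // grows on repeated failures
			continue
		}
		fmt.Println("MountVolume.SetUp succeeded")
		return
	}
}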
Mar 18 16:57:24.125768 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:24.125747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f2220f-605e-47aa-ba58-5f1f72775f0e-openshift-service-ca-bundle\") pod \"sequence-graph-9913d-8999cff78-v8r6t\" (UID: \"16f2220f-605e-47aa-ba58-5f1f72775f0e\") " pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"
Mar 18 16:57:24.629718 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:24.629675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16f2220f-605e-47aa-ba58-5f1f72775f0e-proxy-tls\") pod \"sequence-graph-9913d-8999cff78-v8r6t\" (UID: \"16f2220f-605e-47aa-ba58-5f1f72775f0e\") " pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"
Mar 18 16:57:24.632143 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:24.632119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16f2220f-605e-47aa-ba58-5f1f72775f0e-proxy-tls\") pod \"sequence-graph-9913d-8999cff78-v8r6t\" (UID: \"16f2220f-605e-47aa-ba58-5f1f72775f0e\") " pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"
Mar 18 16:57:24.908887 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:24.908796 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"
Mar 18 16:57:25.027001 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:25.026973 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"]
Mar 18 16:57:25.029630 ip-10-0-143-175 kubenswrapper[2578]: W0318 16:57:25.029602 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f2220f_605e_47aa_ba58_5f1f72775f0e.slice/crio-23f78bcbdd008bf392aeb5958900f4c44a3e9dc7dc0cee257e76dbef4c528655 WatchSource:0}: Error finding container 23f78bcbdd008bf392aeb5958900f4c44a3e9dc7dc0cee257e76dbef4c528655: Status 404 returned error can't find the container with id 23f78bcbdd008bf392aeb5958900f4c44a3e9dc7dc0cee257e76dbef4c528655
Mar 18 16:57:25.094927 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:25.094899 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" event={"ID":"16f2220f-605e-47aa-ba58-5f1f72775f0e","Type":"ContainerStarted","Data":"23f78bcbdd008bf392aeb5958900f4c44a3e9dc7dc0cee257e76dbef4c528655"}
Mar 18 16:57:26.100350 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:26.100305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" event={"ID":"16f2220f-605e-47aa-ba58-5f1f72775f0e","Type":"ContainerStarted","Data":"5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff"}
Mar 18 16:57:26.100754 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:26.100358 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"
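The pod_startup_latency_tracker entry just below reports how long sequence-graph-9913d took to reach Running. Both pull timestamps are the zero value (0001-01-01), meaning no image pull contributed, and the reported podStartE2EDuration appears to be simply watchObservedRunningTime minus podCreationTimestamp; a quick check of that arithmetic:

package main

import (
	"fmt"
	"time"
)

// Recomputes the duration reported below: watchObservedRunningTime
// (16:57:26.116836058) minus podCreationTimestamp (16:57:23) should give
// the logged podStartE2EDuration of 3.116836058s.
func main() {
	created, _ := time.Parse(time.RFC3339, "2026-03-18T16:57:23Z")
	running, _ := time.Parse(time.RFC3339Nano, "2026-03-18T16:57:26.116836058Z")
	fmt.Println(running.Sub(created)) // 3.116836058s
}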
Mar 18 16:57:26.116894 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:26.116850 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" podStartSLOduration=3.116836058 podStartE2EDuration="3.116836058s" podCreationTimestamp="2026-03-18 16:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:57:26.115375581 +0000 UTC m=+800.127484411" watchObservedRunningTime="2026-03-18 16:57:26.116836058 +0000 UTC m=+800.128944882"
Mar 18 16:57:32.110212 ip-10-0-143-175 kubenswrapper[2578]: I0318 16:57:32.110184 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"
[The ImagePullBackOff "Error syncing pod, skipping" entry for kserve/model-serving-api-9699c8d45-stcsx recurs, unchanged, at 16:57:34, 16:57:47, and 16:57:59.]
Mar 18 16:58:12.589509 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:58:12.589478 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:58:23.589828 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:58:23.589794 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:58:38.878732 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:58:38.878634 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:58:38.879083 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:58:38.878825 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:58:38.880000 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:58:38.879972 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:58:50.589379 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:58:50.589342 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:59:02.589692 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:59:02.589641 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:59:15.590044 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:59:15.590001 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:59:26.593491 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:59:26.591553 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:59:39.589987 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:59:39.589947 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 16:59:54.589384 ip-10-0-143-175 kubenswrapper[2578]: E0318 16:59:54.589345 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source 
docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:00:08.589374 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:00:08.589224 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:00:08.589724 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:00:08.589412 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:00:19.589287 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:00:19.589250 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:00:34.589319 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:00:34.589271 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:00:46.591593 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:00:46.591558 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: 
initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:00:57.589291 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:00:57.589257 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:01:09.589983 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:01:09.589948 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:01:24.589542 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:01:24.589492 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:01:38.589710 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:01:38.589677 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access 
to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:01:52.590046 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:01:52.590009 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:02:06.592033 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:02:06.591943 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:02:20.590089 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:02:20.590051 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:02:31.589465 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:02:31.589429 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: 
unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:02:42.589462 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:02:42.589393 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:02:53.589918 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:02:53.589893 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:03:05.589646 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:03:05.589611 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:03:16.591202 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:03:16.591158 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 
18 17:03:30.589954 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:03:30.589914 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:03:46.165070 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:03:46.165022 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:03:46.165610 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:03:46.165237 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 17:03:46.166449 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:03:46.166417 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:03:57.589548 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:03:57.589486 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:04:12.589724 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:04:12.589634 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; 
artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:04:23.589706 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:04:23.589647 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:04:37.591764 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:04:37.589786 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:04:48.589890 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:04:48.589741 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:04:58.707843 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:58.707758 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7"] Mar 18 17:04:58.708306 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:58.708098 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerName="switch-graph-dfe46" containerID="cri-o://b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7" gracePeriod=30 Mar 18 17:04:59.271821 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:59.271790 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq"] Mar 18 17:04:59.275069 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:59.275047 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" Mar 18 17:04:59.277766 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:59.277738 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"] Mar 18 17:04:59.278004 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:59.277978 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" containerID="cri-o://77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd" gracePeriod=30 Mar 18 17:04:59.281014 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:59.280991 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq"] Mar 18 17:04:59.399087 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:59.399049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw667\" (UniqueName: \"kubernetes.io/projected/4ee55d1a-665a-4b72-b65e-2e82d8f0f04c-kube-api-access-mw667\") pod \"success-200-isvc-4ddda-predictor-5c87999b7-2hjjq\" (UID: \"4ee55d1a-665a-4b72-b65e-2e82d8f0f04c\") " pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" Mar 18 17:04:59.500151 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:59.500111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mw667\" (UniqueName: \"kubernetes.io/projected/4ee55d1a-665a-4b72-b65e-2e82d8f0f04c-kube-api-access-mw667\") pod \"success-200-isvc-4ddda-predictor-5c87999b7-2hjjq\" (UID: \"4ee55d1a-665a-4b72-b65e-2e82d8f0f04c\") " pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" Mar 18 17:04:59.507687 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:59.507656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw667\" (UniqueName: \"kubernetes.io/projected/4ee55d1a-665a-4b72-b65e-2e82d8f0f04c-kube-api-access-mw667\") pod \"success-200-isvc-4ddda-predictor-5c87999b7-2hjjq\" (UID: \"4ee55d1a-665a-4b72-b65e-2e82d8f0f04c\") " pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" Mar 18 17:04:59.586651 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:59.586604 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" Mar 18 17:04:59.709067 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:04:59.709036 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq"] Mar 18 17:04:59.713173 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:04:59.713142 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee55d1a_665a_4b72_b65e_2e82d8f0f04c.slice/crio-c0042d4c89c8b39291f15e0c88a3e8373a9017c24cc01851d535b37cff74a66b WatchSource:0}: Error finding container c0042d4c89c8b39291f15e0c88a3e8373a9017c24cc01851d535b37cff74a66b: Status 404 returned error can't find the container with id c0042d4c89c8b39291f15e0c88a3e8373a9017c24cc01851d535b37cff74a66b Mar 18 17:05:00.556155 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:00.556121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" event={"ID":"4ee55d1a-665a-4b72-b65e-2e82d8f0f04c","Type":"ContainerStarted","Data":"3ce7b099388a2547a9e4fd31c61638dfaa32cbdcda1eee7894a899af019b0b93"} Mar 18 17:05:00.556155 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:00.556159 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" event={"ID":"4ee55d1a-665a-4b72-b65e-2e82d8f0f04c","Type":"ContainerStarted","Data":"c0042d4c89c8b39291f15e0c88a3e8373a9017c24cc01851d535b37cff74a66b"} Mar 18 17:05:00.556375 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:00.556284 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" Mar 18 17:05:00.557757 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:00.557734 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Mar 18 17:05:00.570595 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:00.570362 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" podStartSLOduration=1.570346877 podStartE2EDuration="1.570346877s" podCreationTimestamp="2026-03-18 17:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:00.569398503 +0000 UTC m=+1254.581507328" watchObservedRunningTime="2026-03-18 17:05:00.570346877 +0000 UTC m=+1254.582455702" Mar 18 17:05:00.974454 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:00.974369 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerName="switch-graph-dfe46" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:05:01.559500 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:01.559452 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" 
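Two distinct readiness-failure modes are interleaved in this stretch: "connect: connection refused" (the kserve-container on 10.133.0.39:8080 is not accepting connections yet) and "HTTP probe failed with statuscode: 503" from the switch-graph pod, whose endpoint is reachable but reports unready while the pod is shutting down. A minimal stand-in for the HTTP case, relying on the documented probe semantics (any status in [200, 400) counts as success); the address is taken from the log and will not resolve elsewhere:

package main

import (
	"fmt"
	"net/http"
	"time"
)

// Minimal stand-in for an HTTP readiness check like prober.go's: statuses
// in [200, 400) count as ready; a refused connection (nothing listening
// yet) and a 503 (listening but reporting unready, e.g. while terminating)
// are both failures, matching the two outputs seen in the log.
func probe(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp ...: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return nil
	}
	return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	// 10.133.0.39:8080 is the pod IP from the log; outside the cluster this
	// simply demonstrates the failure path.
	if err := probe("http://10.133.0.39:8080/"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}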
Mar 18 17:05:01.589730 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:05:01.589700 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:05:02.420697 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.420671 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" Mar 18 17:05:02.525366 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.525270 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pwx4\" (UniqueName: \"kubernetes.io/projected/95a83498-405c-443e-b367-47e3e745d28c-kube-api-access-6pwx4\") pod \"95a83498-405c-443e-b367-47e3e745d28c\" (UID: \"95a83498-405c-443e-b367-47e3e745d28c\") " Mar 18 17:05:02.527319 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.527291 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a83498-405c-443e-b367-47e3e745d28c-kube-api-access-6pwx4" (OuterVolumeSpecName: "kube-api-access-6pwx4") pod "95a83498-405c-443e-b367-47e3e745d28c" (UID: "95a83498-405c-443e-b367-47e3e745d28c"). InnerVolumeSpecName "kube-api-access-6pwx4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:05:02.564941 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.564906 2578 generic.go:358] "Generic (PLEG): container finished" podID="95a83498-405c-443e-b367-47e3e745d28c" containerID="77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd" exitCode=0 Mar 18 17:05:02.565111 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.564984 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" Mar 18 17:05:02.565111 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.564992 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" event={"ID":"95a83498-405c-443e-b367-47e3e745d28c","Type":"ContainerDied","Data":"77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd"} Mar 18 17:05:02.565111 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.565027 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k" event={"ID":"95a83498-405c-443e-b367-47e3e745d28c","Type":"ContainerDied","Data":"3d9015e22c71619e096b8bc897e6c3cdeedf8acd8ffa0513113f224fef3c1bdc"} Mar 18 17:05:02.565111 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.565043 2578 scope.go:117] "RemoveContainer" containerID="77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd" Mar 18 17:05:02.573621 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.573602 2578 scope.go:117] "RemoveContainer" containerID="77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd" Mar 18 17:05:02.573910 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:05:02.573891 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd\": container with ID starting with 77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd not found: ID does not exist" containerID="77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd" Mar 18 17:05:02.573962 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.573921 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd"} err="failed to get container status \"77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd\": rpc error: code = NotFound desc = could not find container \"77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd\": container with ID starting with 77a5e5e7ae8bcb450d6a2ab3b74cc8a13db429e0e4d6508be64a9675988865bd not found: ID does not exist" Mar 18 17:05:02.585105 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.585080 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"] Mar 18 17:05:02.592731 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.592708 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dfe46-predictor-756f46b767-v7t9k"] Mar 18 17:05:02.626625 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:02.626592 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6pwx4\" (UniqueName: \"kubernetes.io/projected/95a83498-405c-443e-b367-47e3e745d28c-kube-api-access-6pwx4\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:05:04.592277 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:04.592239 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a83498-405c-443e-b367-47e3e745d28c" path="/var/lib/kubelet/pods/95a83498-405c-443e-b367-47e3e745d28c/volumes" Mar 18 17:05:05.973979 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:05.973941 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" 
podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerName="switch-graph-dfe46" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:05:10.973781 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:10.973737 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerName="switch-graph-dfe46" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:05:10.974246 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:10.973882 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 17:05:11.559751 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:11.559711 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Mar 18 17:05:15.589767 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:15.589747 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:05:15.590028 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:05:15.589978 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:05:15.973909 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:15.973809 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerName="switch-graph-dfe46" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:05:20.974582 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:20.974541 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerName="switch-graph-dfe46" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:05:21.560541 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:21.560481 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Mar 18 17:05:25.974207 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:25.974166 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerName="switch-graph-dfe46" probeResult="failure" output="HTTP probe failed with statuscode: 
503" Mar 18 17:05:26.592479 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:05:26.592439 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:05:28.850173 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:28.850148 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 17:05:28.948338 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:28.948297 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-proxy-tls\") pod \"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0\" (UID: \"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0\") " Mar 18 17:05:28.948558 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:28.948345 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-openshift-service-ca-bundle\") pod \"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0\" (UID: \"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0\") " Mar 18 17:05:28.948784 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:28.948753 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" (UID: "86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 17:05:28.950462 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:28.950442 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" (UID: "86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 17:05:29.049489 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.049458 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:05:29.049489 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.049485 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:05:29.657000 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.656960 2578 generic.go:358] "Generic (PLEG): container finished" podID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerID="b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7" exitCode=0 Mar 18 17:05:29.657242 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.657028 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" Mar 18 17:05:29.657242 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.657047 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" event={"ID":"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0","Type":"ContainerDied","Data":"b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7"} Mar 18 17:05:29.657242 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.657089 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7" event={"ID":"86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0","Type":"ContainerDied","Data":"c80e83ab6657761cf0a6343fdf3b8e2f2558d4d58f79e3bacf07dfc6a2727051"} Mar 18 17:05:29.657242 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.657105 2578 scope.go:117] "RemoveContainer" containerID="b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7" Mar 18 17:05:29.666024 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.666005 2578 scope.go:117] "RemoveContainer" containerID="b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7" Mar 18 17:05:29.666275 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:05:29.666256 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7\": container with ID starting with b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7 not found: ID does not exist" containerID="b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7" Mar 18 17:05:29.666322 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.666284 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7"} err="failed to get container status \"b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7\": rpc error: code = NotFound desc = could not find container \"b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7\": container with ID starting with b096406ce3fb3baff05c0d39a03c2f8b12b2924e633670ae13b0a8b8804e20e7 not found: ID does not exist" Mar 18 17:05:29.677769 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.677744 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7"] Mar 18 17:05:29.681623 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:29.681604 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-dfe46-d9c5f577d-l5vh7"] Mar 18 17:05:30.592704 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:30.592663 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" path="/var/lib/kubelet/pods/86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0/volumes" Mar 18 17:05:31.559938 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:31.559893 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Mar 18 17:05:38.630435 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.630402 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"] Mar 18 17:05:38.630867 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.630670 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerName="sequence-graph-9913d" containerID="cri-o://5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff" gracePeriod=30 Mar 18 17:05:38.971729 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.971643 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m"] Mar 18 17:05:38.972177 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.972060 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" containerID="cri-o://12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988" gracePeriod=30 Mar 18 17:05:38.982744 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.982717 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw"] Mar 18 17:05:38.983107 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.983091 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" Mar 18 17:05:38.983180 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.983110 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" Mar 18 17:05:38.983180 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.983128 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerName="switch-graph-dfe46" Mar 18 17:05:38.983180 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.983137 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerName="switch-graph-dfe46" Mar 18 17:05:38.983328 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.983240 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="95a83498-405c-443e-b367-47e3e745d28c" containerName="kserve-container" Mar 18 17:05:38.983328 ip-10-0-143-175 kubenswrapper[2578]: I0318 
17:05:38.983256 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="86ab7c7e-c09f-49e8-9c9c-4ffa6ecc6ab0" containerName="switch-graph-dfe46" Mar 18 17:05:38.986106 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.986088 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" Mar 18 17:05:38.994896 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:38.994875 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw"] Mar 18 17:05:39.026914 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:39.026887 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw99k\" (UniqueName: \"kubernetes.io/projected/0ec39ecc-1be3-42be-86e0-f176bf8e08e1-kube-api-access-fw99k\") pod \"success-200-isvc-18dc8-predictor-8c499d9c-282kw\" (UID: \"0ec39ecc-1be3-42be-86e0-f176bf8e08e1\") " pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" Mar 18 17:05:39.127594 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:39.127560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fw99k\" (UniqueName: \"kubernetes.io/projected/0ec39ecc-1be3-42be-86e0-f176bf8e08e1-kube-api-access-fw99k\") pod \"success-200-isvc-18dc8-predictor-8c499d9c-282kw\" (UID: \"0ec39ecc-1be3-42be-86e0-f176bf8e08e1\") " pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" Mar 18 17:05:39.134812 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:39.134783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw99k\" (UniqueName: \"kubernetes.io/projected/0ec39ecc-1be3-42be-86e0-f176bf8e08e1-kube-api-access-fw99k\") pod \"success-200-isvc-18dc8-predictor-8c499d9c-282kw\" (UID: \"0ec39ecc-1be3-42be-86e0-f176bf8e08e1\") " pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" Mar 18 17:05:39.297103 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:39.297067 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" Mar 18 17:05:39.424227 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:39.424161 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw"] Mar 18 17:05:39.427413 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:05:39.427385 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ec39ecc_1be3_42be_86e0_f176bf8e08e1.slice/crio-f0fe75bdb59986d001fc7a7954eee0609fd030c6173459e94ab6bcb33888a1ad WatchSource:0}: Error finding container f0fe75bdb59986d001fc7a7954eee0609fd030c6173459e94ab6bcb33888a1ad: Status 404 returned error can't find the container with id f0fe75bdb59986d001fc7a7954eee0609fd030c6173459e94ab6bcb33888a1ad Mar 18 17:05:39.589718 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:05:39.589679 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:05:39.693577 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:39.693545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" event={"ID":"0ec39ecc-1be3-42be-86e0-f176bf8e08e1","Type":"ContainerStarted","Data":"db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d"} Mar 18 17:05:39.693577 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:39.693581 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" event={"ID":"0ec39ecc-1be3-42be-86e0-f176bf8e08e1","Type":"ContainerStarted","Data":"f0fe75bdb59986d001fc7a7954eee0609fd030c6173459e94ab6bcb33888a1ad"} Mar 18 17:05:39.694053 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:39.693727 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" Mar 18 17:05:39.695241 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:39.695211 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:05:39.708192 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:39.707615 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" podStartSLOduration=1.707598137 podStartE2EDuration="1.707598137s" podCreationTimestamp="2026-03-18 17:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:39.707202477 
+0000 UTC m=+1293.719311305" watchObservedRunningTime="2026-03-18 17:05:39.707598137 +0000 UTC m=+1293.719706963" Mar 18 17:05:40.697305 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:40.697264 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:05:41.559986 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:41.559937 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Mar 18 17:05:42.108860 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.108825 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerName="sequence-graph-9913d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:05:42.224103 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.224080 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" Mar 18 17:05:42.252827 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.252790 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzts6\" (UniqueName: \"kubernetes.io/projected/e10de7c8-98d3-46a1-87ee-53839a451906-kube-api-access-dzts6\") pod \"e10de7c8-98d3-46a1-87ee-53839a451906\" (UID: \"e10de7c8-98d3-46a1-87ee-53839a451906\") " Mar 18 17:05:42.254843 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.254814 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e10de7c8-98d3-46a1-87ee-53839a451906-kube-api-access-dzts6" (OuterVolumeSpecName: "kube-api-access-dzts6") pod "e10de7c8-98d3-46a1-87ee-53839a451906" (UID: "e10de7c8-98d3-46a1-87ee-53839a451906"). InnerVolumeSpecName "kube-api-access-dzts6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:05:42.353645 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.353564 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dzts6\" (UniqueName: \"kubernetes.io/projected/e10de7c8-98d3-46a1-87ee-53839a451906-kube-api-access-dzts6\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:05:42.703949 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.703861 2578 generic.go:358] "Generic (PLEG): container finished" podID="e10de7c8-98d3-46a1-87ee-53839a451906" containerID="12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988" exitCode=0 Mar 18 17:05:42.703949 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.703936 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" Mar 18 17:05:42.704143 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.703942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" event={"ID":"e10de7c8-98d3-46a1-87ee-53839a451906","Type":"ContainerDied","Data":"12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988"} Mar 18 17:05:42.704143 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.703980 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m" event={"ID":"e10de7c8-98d3-46a1-87ee-53839a451906","Type":"ContainerDied","Data":"a780d50446591e3aaba265c2e4e9586531d271719c791328d1b43513089b9b5a"} Mar 18 17:05:42.704143 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.703996 2578 scope.go:117] "RemoveContainer" containerID="12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988" Mar 18 17:05:42.712066 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.712039 2578 scope.go:117] "RemoveContainer" containerID="12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988" Mar 18 17:05:42.712302 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:05:42.712282 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988\": container with ID starting with 12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988 not found: ID does not exist" containerID="12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988" Mar 18 17:05:42.712387 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.712310 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988"} err="failed to get container status \"12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988\": rpc error: code = NotFound desc = could not find container \"12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988\": container with ID starting with 12364d588b7ecf8385f9f937cc0ca24434750ceaeb1beaea8f1b9c5a45cea988 not found: ID does not exist" Mar 18 17:05:42.718958 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.718937 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m"] Mar 18 17:05:42.721314 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:42.721294 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9913d-predictor-6b8667cb5c-rct6m"] Mar 18 17:05:44.593416 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:44.593380 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" path="/var/lib/kubelet/pods/e10de7c8-98d3-46a1-87ee-53839a451906/volumes" Mar 18 17:05:47.109860 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:47.109821 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerName="sequence-graph-9913d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:05:50.589651 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:05:50.589511 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:05:50.697628 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:50.697584 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:05:51.560594 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:51.560551 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Mar 18 17:05:52.108031 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:52.107991 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerName="sequence-graph-9913d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:05:52.108493 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:52.108103 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" Mar 18 17:05:57.108285 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:05:57.108245 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerName="sequence-graph-9913d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:06:00.697368 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:00.697327 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:06:01.561235 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:01.561205 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" Mar 18 17:06:02.108409 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:02.108352 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerName="sequence-graph-9913d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:06:02.590234 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:06:02.590188 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:06:07.107569 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:07.107516 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerName="sequence-graph-9913d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:06:08.776567 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.776540 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" Mar 18 17:06:08.788246 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.788217 2578 generic.go:358] "Generic (PLEG): container finished" podID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerID="5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff" exitCode=0 Mar 18 17:06:08.788422 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.788265 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" event={"ID":"16f2220f-605e-47aa-ba58-5f1f72775f0e","Type":"ContainerDied","Data":"5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff"} Mar 18 17:06:08.788422 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.788281 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" Mar 18 17:06:08.788422 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.788289 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t" event={"ID":"16f2220f-605e-47aa-ba58-5f1f72775f0e","Type":"ContainerDied","Data":"23f78bcbdd008bf392aeb5958900f4c44a3e9dc7dc0cee257e76dbef4c528655"} Mar 18 17:06:08.788422 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.788305 2578 scope.go:117] "RemoveContainer" containerID="5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff" Mar 18 17:06:08.797419 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.797393 2578 scope.go:117] "RemoveContainer" containerID="5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff" Mar 18 17:06:08.797741 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:06:08.797718 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff\": container with ID starting with 5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff not found: ID does not exist" containerID="5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff" Mar 18 17:06:08.797828 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.797749 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff"} err="failed to get container status \"5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff\": rpc error: code = NotFound desc = could not find container \"5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff\": container with ID starting with 5b30cea81476b9699a7c64591152fc4501cab700964212ce86625fce59f56cff not found: ID does not exist" Mar 18 17:06:08.872846 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.872805 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16f2220f-605e-47aa-ba58-5f1f72775f0e-proxy-tls\") pod \"16f2220f-605e-47aa-ba58-5f1f72775f0e\" (UID: \"16f2220f-605e-47aa-ba58-5f1f72775f0e\") " Mar 18 17:06:08.873049 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.872867 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f2220f-605e-47aa-ba58-5f1f72775f0e-openshift-service-ca-bundle\") pod \"16f2220f-605e-47aa-ba58-5f1f72775f0e\" (UID: \"16f2220f-605e-47aa-ba58-5f1f72775f0e\") " Mar 18 17:06:08.873237 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.873210 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f2220f-605e-47aa-ba58-5f1f72775f0e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "16f2220f-605e-47aa-ba58-5f1f72775f0e" (UID: "16f2220f-605e-47aa-ba58-5f1f72775f0e"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 17:06:08.874935 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.874906 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f2220f-605e-47aa-ba58-5f1f72775f0e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "16f2220f-605e-47aa-ba58-5f1f72775f0e" (UID: "16f2220f-605e-47aa-ba58-5f1f72775f0e"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 17:06:08.974158 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.974078 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f2220f-605e-47aa-ba58-5f1f72775f0e-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:06:08.974158 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:08.974107 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16f2220f-605e-47aa-ba58-5f1f72775f0e-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:06:09.108403 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:09.108370 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"] Mar 18 17:06:09.111445 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:09.111419 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-9913d-8999cff78-v8r6t"] Mar 18 17:06:10.593211 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:10.593176 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" path="/var/lib/kubelet/pods/16f2220f-605e-47aa-ba58-5f1f72775f0e/volumes" Mar 18 17:06:10.697794 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:10.697749 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:06:16.590663 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:06:16.590632 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:06:19.205913 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.205877 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv"] Mar 18 17:06:19.206278 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.206186 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerName="sequence-graph-9913d" Mar 18 17:06:19.206278 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.206196 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerName="sequence-graph-9913d" Mar 18 17:06:19.206278 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.206219 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" Mar 18 17:06:19.206278 ip-10-0-143-175 
kubenswrapper[2578]: I0318 17:06:19.206226 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" Mar 18 17:06:19.206278 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.206271 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e10de7c8-98d3-46a1-87ee-53839a451906" containerName="kserve-container" Mar 18 17:06:19.206278 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.206279 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="16f2220f-605e-47aa-ba58-5f1f72775f0e" containerName="sequence-graph-9913d" Mar 18 17:06:19.210323 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.210304 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:19.212354 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.212327 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-4ddda-kube-rbac-proxy-sar-config\"" Mar 18 17:06:19.212459 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.212327 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-4ddda-serving-cert\"" Mar 18 17:06:19.216128 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.216099 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv"] Mar 18 17:06:19.366435 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.366405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6974693a-2c12-408d-b531-df87aeae8976-proxy-tls\") pod \"ensemble-graph-4ddda-559cdcff4b-9mqdv\" (UID: \"6974693a-2c12-408d-b531-df87aeae8976\") " pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:19.366622 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.366446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6974693a-2c12-408d-b531-df87aeae8976-openshift-service-ca-bundle\") pod \"ensemble-graph-4ddda-559cdcff4b-9mqdv\" (UID: \"6974693a-2c12-408d-b531-df87aeae8976\") " pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:19.467774 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.467678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6974693a-2c12-408d-b531-df87aeae8976-proxy-tls\") pod \"ensemble-graph-4ddda-559cdcff4b-9mqdv\" (UID: \"6974693a-2c12-408d-b531-df87aeae8976\") " pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:19.467774 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.467733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6974693a-2c12-408d-b531-df87aeae8976-openshift-service-ca-bundle\") pod \"ensemble-graph-4ddda-559cdcff4b-9mqdv\" (UID: \"6974693a-2c12-408d-b531-df87aeae8976\") " pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:19.468016 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:06:19.467819 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-4ddda-serving-cert: secret "ensemble-graph-4ddda-serving-cert" not found Mar 18 
17:06:19.468016 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:06:19.467894 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6974693a-2c12-408d-b531-df87aeae8976-proxy-tls podName:6974693a-2c12-408d-b531-df87aeae8976 nodeName:}" failed. No retries permitted until 2026-03-18 17:06:19.967876789 +0000 UTC m=+1333.979985601 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6974693a-2c12-408d-b531-df87aeae8976-proxy-tls") pod "ensemble-graph-4ddda-559cdcff4b-9mqdv" (UID: "6974693a-2c12-408d-b531-df87aeae8976") : secret "ensemble-graph-4ddda-serving-cert" not found Mar 18 17:06:19.468397 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.468374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6974693a-2c12-408d-b531-df87aeae8976-openshift-service-ca-bundle\") pod \"ensemble-graph-4ddda-559cdcff4b-9mqdv\" (UID: \"6974693a-2c12-408d-b531-df87aeae8976\") " pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:19.972449 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.972410 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6974693a-2c12-408d-b531-df87aeae8976-proxy-tls\") pod \"ensemble-graph-4ddda-559cdcff4b-9mqdv\" (UID: \"6974693a-2c12-408d-b531-df87aeae8976\") " pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:19.974829 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:19.974798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6974693a-2c12-408d-b531-df87aeae8976-proxy-tls\") pod \"ensemble-graph-4ddda-559cdcff4b-9mqdv\" (UID: \"6974693a-2c12-408d-b531-df87aeae8976\") " pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:20.121902 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:20.121862 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:20.243111 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:20.243026 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv"] Mar 18 17:06:20.246131 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:06:20.246096 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6974693a_2c12_408d_b531_df87aeae8976.slice/crio-4a5ce94cc933d57ca4b9fb562a084a2200e62a708ed5219ac2be7dca97ba8b67 WatchSource:0}: Error finding container 4a5ce94cc933d57ca4b9fb562a084a2200e62a708ed5219ac2be7dca97ba8b67: Status 404 returned error can't find the container with id 4a5ce94cc933d57ca4b9fb562a084a2200e62a708ed5219ac2be7dca97ba8b67 Mar 18 17:06:20.697593 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:20.697495 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:06:20.828610 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:20.828574 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" event={"ID":"6974693a-2c12-408d-b531-df87aeae8976","Type":"ContainerStarted","Data":"96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9"} Mar 18 17:06:20.828610 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:20.828612 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" event={"ID":"6974693a-2c12-408d-b531-df87aeae8976","Type":"ContainerStarted","Data":"4a5ce94cc933d57ca4b9fb562a084a2200e62a708ed5219ac2be7dca97ba8b67"} Mar 18 17:06:20.828824 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:20.828708 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:20.873378 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:20.873331 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" podStartSLOduration=1.873315903 podStartE2EDuration="1.873315903s" podCreationTimestamp="2026-03-18 17:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:06:20.870761014 +0000 UTC m=+1334.882869850" watchObservedRunningTime="2026-03-18 17:06:20.873315903 +0000 UTC m=+1334.885424728" Mar 18 17:06:26.839045 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:26.838960 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:29.252683 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.252650 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv"] Mar 18 17:06:29.253083 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.252854 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" podUID="6974693a-2c12-408d-b531-df87aeae8976" containerName="ensemble-graph-4ddda" 
containerID="cri-o://96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9" gracePeriod=30 Mar 18 17:06:29.576841 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.576803 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns"] Mar 18 17:06:29.580599 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.580575 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" Mar 18 17:06:29.585516 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.585492 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns"] Mar 18 17:06:29.661000 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.660959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29pbd\" (UniqueName: \"kubernetes.io/projected/97d9ff2f-d076-4548-83cd-129f594d7745-kube-api-access-29pbd\") pod \"success-200-isvc-19f83-predictor-b5c694bf5-xkjns\" (UID: \"97d9ff2f-d076-4548-83cd-129f594d7745\") " pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" Mar 18 17:06:29.670760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.670729 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq"] Mar 18 17:06:29.671087 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.671060 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" containerID="cri-o://3ce7b099388a2547a9e4fd31c61638dfaa32cbdcda1eee7894a899af019b0b93" gracePeriod=30 Mar 18 17:06:29.761545 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.761491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29pbd\" (UniqueName: \"kubernetes.io/projected/97d9ff2f-d076-4548-83cd-129f594d7745-kube-api-access-29pbd\") pod \"success-200-isvc-19f83-predictor-b5c694bf5-xkjns\" (UID: \"97d9ff2f-d076-4548-83cd-129f594d7745\") " pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" Mar 18 17:06:29.768471 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.768438 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29pbd\" (UniqueName: \"kubernetes.io/projected/97d9ff2f-d076-4548-83cd-129f594d7745-kube-api-access-29pbd\") pod \"success-200-isvc-19f83-predictor-b5c694bf5-xkjns\" (UID: \"97d9ff2f-d076-4548-83cd-129f594d7745\") " pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" Mar 18 17:06:29.891621 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:29.891540 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" Mar 18 17:06:30.021845 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:30.021795 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns"] Mar 18 17:06:30.024936 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:06:30.024908 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d9ff2f_d076_4548_83cd_129f594d7745.slice/crio-bf4764c1ca9bed19c03010222b5cee31c738c13fcf5edb1e5b45539be719dd4b WatchSource:0}: Error finding container bf4764c1ca9bed19c03010222b5cee31c738c13fcf5edb1e5b45539be719dd4b: Status 404 returned error can't find the container with id bf4764c1ca9bed19c03010222b5cee31c738c13fcf5edb1e5b45539be719dd4b Mar 18 17:06:30.697257 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:30.697216 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:06:30.864587 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:30.864548 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" event={"ID":"97d9ff2f-d076-4548-83cd-129f594d7745","Type":"ContainerStarted","Data":"2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6"} Mar 18 17:06:30.864587 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:30.864591 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" event={"ID":"97d9ff2f-d076-4548-83cd-129f594d7745","Type":"ContainerStarted","Data":"bf4764c1ca9bed19c03010222b5cee31c738c13fcf5edb1e5b45539be719dd4b"} Mar 18 17:06:30.864789 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:30.864728 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" Mar 18 17:06:30.866163 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:30.866134 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Mar 18 17:06:30.880193 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:30.880145 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" podStartSLOduration=1.8801310930000001 podStartE2EDuration="1.880131093s" podCreationTimestamp="2026-03-18 17:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:06:30.879370446 +0000 UTC m=+1344.891479270" watchObservedRunningTime="2026-03-18 17:06:30.880131093 +0000 UTC m=+1344.892239919" Mar 18 17:06:31.560579 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:31.560501 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.39:8080: connect: connection refused" Mar 18 17:06:31.589247 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:06:31.589215 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:06:31.837402 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:31.837272 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" podUID="6974693a-2c12-408d-b531-df87aeae8976" containerName="ensemble-graph-4ddda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:06:31.868207 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:31.868172 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Mar 18 17:06:32.873066 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:32.873028 2578 generic.go:358] "Generic (PLEG): container finished" podID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerID="3ce7b099388a2547a9e4fd31c61638dfaa32cbdcda1eee7894a899af019b0b93" exitCode=0 Mar 18 17:06:32.873584 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:32.873095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" event={"ID":"4ee55d1a-665a-4b72-b65e-2e82d8f0f04c","Type":"ContainerDied","Data":"3ce7b099388a2547a9e4fd31c61638dfaa32cbdcda1eee7894a899af019b0b93"} Mar 18 17:06:32.924077 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:32.924055 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" Mar 18 17:06:32.990616 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:32.990504 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw667\" (UniqueName: \"kubernetes.io/projected/4ee55d1a-665a-4b72-b65e-2e82d8f0f04c-kube-api-access-mw667\") pod \"4ee55d1a-665a-4b72-b65e-2e82d8f0f04c\" (UID: \"4ee55d1a-665a-4b72-b65e-2e82d8f0f04c\") " Mar 18 17:06:32.992577 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:32.992551 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee55d1a-665a-4b72-b65e-2e82d8f0f04c-kube-api-access-mw667" (OuterVolumeSpecName: "kube-api-access-mw667") pod "4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" (UID: "4ee55d1a-665a-4b72-b65e-2e82d8f0f04c"). InnerVolumeSpecName "kube-api-access-mw667". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:06:33.091144 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:33.091110 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mw667\" (UniqueName: \"kubernetes.io/projected/4ee55d1a-665a-4b72-b65e-2e82d8f0f04c-kube-api-access-mw667\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:06:33.877428 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:33.877398 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" event={"ID":"4ee55d1a-665a-4b72-b65e-2e82d8f0f04c","Type":"ContainerDied","Data":"c0042d4c89c8b39291f15e0c88a3e8373a9017c24cc01851d535b37cff74a66b"} Mar 18 17:06:33.877428 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:33.877433 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq" Mar 18 17:06:33.877917 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:33.877438 2578 scope.go:117] "RemoveContainer" containerID="3ce7b099388a2547a9e4fd31c61638dfaa32cbdcda1eee7894a899af019b0b93" Mar 18 17:06:33.896759 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:33.896637 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq"] Mar 18 17:06:33.898644 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:33.898625 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4ddda-predictor-5c87999b7-2hjjq"] Mar 18 17:06:34.592397 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:34.592367 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" path="/var/lib/kubelet/pods/4ee55d1a-665a-4b72-b65e-2e82d8f0f04c/volumes" Mar 18 17:06:36.836992 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:36.836942 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" podUID="6974693a-2c12-408d-b531-df87aeae8976" containerName="ensemble-graph-4ddda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:06:40.698293 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:40.698260 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" Mar 18 17:06:41.838878 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:41.838823 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" podUID="6974693a-2c12-408d-b531-df87aeae8976" containerName="ensemble-graph-4ddda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:06:41.839353 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:41.838964 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:41.869053 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:41.869015 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Mar 18 17:06:44.589382 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:06:44.589352 2578 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:06:46.837581 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:46.837539 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" podUID="6974693a-2c12-408d-b531-df87aeae8976" containerName="ensemble-graph-4ddda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:06:51.837065 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:51.837025 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" podUID="6974693a-2c12-408d-b531-df87aeae8976" containerName="ensemble-graph-4ddda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:06:51.868715 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:51.868676 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Mar 18 17:06:56.837584 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:56.837545 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" podUID="6974693a-2c12-408d-b531-df87aeae8976" containerName="ensemble-graph-4ddda" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:06:58.589422 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:06:58.589226 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:06:58.867371 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:58.867292 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx"] Mar 18 17:06:58.867643 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:58.867629 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" Mar 18 17:06:58.867693 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:58.867646 2578 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" Mar 18 17:06:58.867728 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:58.867701 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ee55d1a-665a-4b72-b65e-2e82d8f0f04c" containerName="kserve-container" Mar 18 17:06:58.871726 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:58.871711 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:06:58.873508 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:58.873486 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-18dc8-kube-rbac-proxy-sar-config\"" Mar 18 17:06:58.873661 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:58.873492 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-18dc8-serving-cert\"" Mar 18 17:06:58.877294 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:58.877272 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx"] Mar 18 17:06:59.004336 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.004304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/603b043e-b360-43a4-9780-75059a19a711-openshift-service-ca-bundle\") pod \"sequence-graph-18dc8-546dc7d746-98whx\" (UID: \"603b043e-b360-43a4-9780-75059a19a711\") " pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:06:59.004506 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.004350 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/603b043e-b360-43a4-9780-75059a19a711-proxy-tls\") pod \"sequence-graph-18dc8-546dc7d746-98whx\" (UID: \"603b043e-b360-43a4-9780-75059a19a711\") " pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:06:59.105223 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.105181 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/603b043e-b360-43a4-9780-75059a19a711-openshift-service-ca-bundle\") pod \"sequence-graph-18dc8-546dc7d746-98whx\" (UID: \"603b043e-b360-43a4-9780-75059a19a711\") " pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:06:59.105421 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.105238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/603b043e-b360-43a4-9780-75059a19a711-proxy-tls\") pod \"sequence-graph-18dc8-546dc7d746-98whx\" (UID: \"603b043e-b360-43a4-9780-75059a19a711\") " pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:06:59.105994 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.105972 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/603b043e-b360-43a4-9780-75059a19a711-openshift-service-ca-bundle\") pod \"sequence-graph-18dc8-546dc7d746-98whx\" (UID: \"603b043e-b360-43a4-9780-75059a19a711\") " pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:06:59.107781 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.107753 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/603b043e-b360-43a4-9780-75059a19a711-proxy-tls\") pod \"sequence-graph-18dc8-546dc7d746-98whx\" (UID: \"603b043e-b360-43a4-9780-75059a19a711\") " pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:06:59.183880 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.183789 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:06:59.307158 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.307040 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx"] Mar 18 17:06:59.315718 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:06:59.315690 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod603b043e_b360_43a4_9780_75059a19a711.slice/crio-1e02f6ed7a2f0ed0939f6770fe5a8c67e711eccf30f3d1e7eca77c8e6dc6a9c9 WatchSource:0}: Error finding container 1e02f6ed7a2f0ed0939f6770fe5a8c67e711eccf30f3d1e7eca77c8e6dc6a9c9: Status 404 returned error can't find the container with id 1e02f6ed7a2f0ed0939f6770fe5a8c67e711eccf30f3d1e7eca77c8e6dc6a9c9 Mar 18 17:06:59.395790 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.395770 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:59.508421 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.508315 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6974693a-2c12-408d-b531-df87aeae8976-openshift-service-ca-bundle\") pod \"6974693a-2c12-408d-b531-df87aeae8976\" (UID: \"6974693a-2c12-408d-b531-df87aeae8976\") " Mar 18 17:06:59.508421 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.508418 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6974693a-2c12-408d-b531-df87aeae8976-proxy-tls\") pod \"6974693a-2c12-408d-b531-df87aeae8976\" (UID: \"6974693a-2c12-408d-b531-df87aeae8976\") " Mar 18 17:06:59.508709 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.508682 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6974693a-2c12-408d-b531-df87aeae8976-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "6974693a-2c12-408d-b531-df87aeae8976" (UID: "6974693a-2c12-408d-b531-df87aeae8976"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 17:06:59.510454 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.510434 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6974693a-2c12-408d-b531-df87aeae8976-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6974693a-2c12-408d-b531-df87aeae8976" (UID: "6974693a-2c12-408d-b531-df87aeae8976"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 17:06:59.609422 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.609379 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6974693a-2c12-408d-b531-df87aeae8976-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:06:59.609422 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.609410 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6974693a-2c12-408d-b531-df87aeae8976-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:06:59.962661 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.962625 2578 generic.go:358] "Generic (PLEG): container finished" podID="6974693a-2c12-408d-b531-df87aeae8976" containerID="96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9" exitCode=0 Mar 18 17:06:59.962864 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.962688 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" Mar 18 17:06:59.962864 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.962717 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" event={"ID":"6974693a-2c12-408d-b531-df87aeae8976","Type":"ContainerDied","Data":"96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9"} Mar 18 17:06:59.962864 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.962756 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv" event={"ID":"6974693a-2c12-408d-b531-df87aeae8976","Type":"ContainerDied","Data":"4a5ce94cc933d57ca4b9fb562a084a2200e62a708ed5219ac2be7dca97ba8b67"} Mar 18 17:06:59.962864 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.962777 2578 scope.go:117] "RemoveContainer" containerID="96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9" Mar 18 17:06:59.964234 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.964212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" event={"ID":"603b043e-b360-43a4-9780-75059a19a711","Type":"ContainerStarted","Data":"a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b"} Mar 18 17:06:59.964309 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.964239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" event={"ID":"603b043e-b360-43a4-9780-75059a19a711","Type":"ContainerStarted","Data":"1e02f6ed7a2f0ed0939f6770fe5a8c67e711eccf30f3d1e7eca77c8e6dc6a9c9"} Mar 18 17:06:59.964367 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.964349 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:06:59.973671 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.973652 2578 scope.go:117] "RemoveContainer" containerID="96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9" Mar 18 17:06:59.973961 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:06:59.973936 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9\": container with ID starting with 
96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9 not found: ID does not exist" containerID="96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9" Mar 18 17:06:59.974057 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.973964 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9"} err="failed to get container status \"96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9\": rpc error: code = NotFound desc = could not find container \"96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9\": container with ID starting with 96d331a7c05287cc0a4345f0a782cbaa255aaac897aa63d13a039ae016e793f9 not found: ID does not exist" Mar 18 17:06:59.979647 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.979606 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" podStartSLOduration=1.979595206 podStartE2EDuration="1.979595206s" podCreationTimestamp="2026-03-18 17:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:06:59.979126886 +0000 UTC m=+1373.991235710" watchObservedRunningTime="2026-03-18 17:06:59.979595206 +0000 UTC m=+1373.991704030" Mar 18 17:06:59.998858 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.993775 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv"] Mar 18 17:06:59.998858 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:06:59.996974 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-4ddda-559cdcff4b-9mqdv"] Mar 18 17:07:00.593288 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:00.593258 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6974693a-2c12-408d-b531-df87aeae8976" path="/var/lib/kubelet/pods/6974693a-2c12-408d-b531-df87aeae8976/volumes" Mar 18 17:07:01.869372 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:01.869324 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Mar 18 17:07:05.974627 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:05.974600 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:07:08.937498 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:08.937458 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx"] Mar 18 17:07:08.937898 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:08.937800 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" podUID="603b043e-b360-43a4-9780-75059a19a711" containerName="sequence-graph-18dc8" containerID="cri-o://a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b" gracePeriod=30 Mar 18 17:07:09.264198 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.264166 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw"] Mar 18 17:07:09.264416 ip-10-0-143-175 kubenswrapper[2578]: I0318 
17:07:09.264395 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" containerID="cri-o://db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d" gracePeriod=30 Mar 18 17:07:09.274237 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.274209 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv"] Mar 18 17:07:09.274547 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.274518 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6974693a-2c12-408d-b531-df87aeae8976" containerName="ensemble-graph-4ddda" Mar 18 17:07:09.274547 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.274544 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6974693a-2c12-408d-b531-df87aeae8976" containerName="ensemble-graph-4ddda" Mar 18 17:07:09.274623 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.274603 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6974693a-2c12-408d-b531-df87aeae8976" containerName="ensemble-graph-4ddda" Mar 18 17:07:09.278681 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.278665 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" Mar 18 17:07:09.284299 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.284267 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv"] Mar 18 17:07:09.392279 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.392242 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdm9n\" (UniqueName: \"kubernetes.io/projected/bb38c4ad-38c5-4006-9eab-4a67b9575f24-kube-api-access-xdm9n\") pod \"success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv\" (UID: \"bb38c4ad-38c5-4006-9eab-4a67b9575f24\") " pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" Mar 18 17:07:09.492850 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.492809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdm9n\" (UniqueName: \"kubernetes.io/projected/bb38c4ad-38c5-4006-9eab-4a67b9575f24-kube-api-access-xdm9n\") pod \"success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv\" (UID: \"bb38c4ad-38c5-4006-9eab-4a67b9575f24\") " pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" Mar 18 17:07:09.507839 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.507801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdm9n\" (UniqueName: \"kubernetes.io/projected/bb38c4ad-38c5-4006-9eab-4a67b9575f24-kube-api-access-xdm9n\") pod \"success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv\" (UID: \"bb38c4ad-38c5-4006-9eab-4a67b9575f24\") " pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" Mar 18 17:07:09.590078 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.590047 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" Mar 18 17:07:09.731388 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.731355 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv"] Mar 18 17:07:09.734091 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:07:09.734055 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb38c4ad_38c5_4006_9eab_4a67b9575f24.slice/crio-6a2f40fe2de51fb813d6bab03801f7c31721e10e313a772c72dc78d600de5a56 WatchSource:0}: Error finding container 6a2f40fe2de51fb813d6bab03801f7c31721e10e313a772c72dc78d600de5a56: Status 404 returned error can't find the container with id 6a2f40fe2de51fb813d6bab03801f7c31721e10e313a772c72dc78d600de5a56 Mar 18 17:07:09.999094 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.999053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" event={"ID":"bb38c4ad-38c5-4006-9eab-4a67b9575f24","Type":"ContainerStarted","Data":"6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b"} Mar 18 17:07:09.999094 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.999092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" event={"ID":"bb38c4ad-38c5-4006-9eab-4a67b9575f24","Type":"ContainerStarted","Data":"6a2f40fe2de51fb813d6bab03801f7c31721e10e313a772c72dc78d600de5a56"} Mar 18 17:07:09.999613 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:09.999243 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" Mar 18 17:07:10.000715 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:10.000690 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Mar 18 17:07:10.014381 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:10.014274 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" podStartSLOduration=1.01425648 podStartE2EDuration="1.01425648s" podCreationTimestamp="2026-03-18 17:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:07:10.013429843 +0000 UTC m=+1384.025538669" watchObservedRunningTime="2026-03-18 17:07:10.01425648 +0000 UTC m=+1384.026365306" Mar 18 17:07:10.590022 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:07:10.589953 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested 
resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:07:10.697516 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:10.697469 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:07:10.973070 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:10.972977 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" podUID="603b043e-b360-43a4-9780-75059a19a711" containerName="sequence-graph-18dc8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:07:11.003073 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:11.003031 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Mar 18 17:07:11.868716 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:11.868673 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Mar 18 17:07:12.516729 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:12.516705 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" Mar 18 17:07:12.622175 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:12.622085 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw99k\" (UniqueName: \"kubernetes.io/projected/0ec39ecc-1be3-42be-86e0-f176bf8e08e1-kube-api-access-fw99k\") pod \"0ec39ecc-1be3-42be-86e0-f176bf8e08e1\" (UID: \"0ec39ecc-1be3-42be-86e0-f176bf8e08e1\") " Mar 18 17:07:12.624161 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:12.624130 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ec39ecc-1be3-42be-86e0-f176bf8e08e1-kube-api-access-fw99k" (OuterVolumeSpecName: "kube-api-access-fw99k") pod "0ec39ecc-1be3-42be-86e0-f176bf8e08e1" (UID: "0ec39ecc-1be3-42be-86e0-f176bf8e08e1"). InnerVolumeSpecName "kube-api-access-fw99k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:07:12.723343 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:12.723308 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fw99k\" (UniqueName: \"kubernetes.io/projected/0ec39ecc-1be3-42be-86e0-f176bf8e08e1-kube-api-access-fw99k\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:07:13.010295 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:13.010208 2578 generic.go:358] "Generic (PLEG): container finished" podID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerID="db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d" exitCode=0 Mar 18 17:07:13.010295 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:13.010288 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" Mar 18 17:07:13.010502 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:13.010293 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" event={"ID":"0ec39ecc-1be3-42be-86e0-f176bf8e08e1","Type":"ContainerDied","Data":"db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d"} Mar 18 17:07:13.010502 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:13.010330 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw" event={"ID":"0ec39ecc-1be3-42be-86e0-f176bf8e08e1","Type":"ContainerDied","Data":"f0fe75bdb59986d001fc7a7954eee0609fd030c6173459e94ab6bcb33888a1ad"} Mar 18 17:07:13.010502 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:13.010345 2578 scope.go:117] "RemoveContainer" containerID="db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d" Mar 18 17:07:13.022450 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:13.022433 2578 scope.go:117] "RemoveContainer" containerID="db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d" Mar 18 17:07:13.022763 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:07:13.022726 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d\": container with ID starting with db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d not found: ID does not exist" containerID="db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d" Mar 18 17:07:13.022763 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:13.022753 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d"} err="failed to get container status \"db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d\": rpc error: code = NotFound desc = could not find container \"db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d\": container with ID starting with db0c261dbca995d71639c7b03911426d66ad740e551b682424741641e3c73c4d not found: ID does not exist" Mar 18 17:07:13.032036 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:13.032008 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw"] Mar 18 17:07:13.035877 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:13.035854 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-18dc8-predictor-8c499d9c-282kw"] Mar 18 17:07:14.592335 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:14.592300 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" path="/var/lib/kubelet/pods/0ec39ecc-1be3-42be-86e0-f176bf8e08e1/volumes" Mar 18 17:07:15.973267 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:15.973228 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" podUID="603b043e-b360-43a4-9780-75059a19a711" containerName="sequence-graph-18dc8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:07:20.973416 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:20.973376 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" podUID="603b043e-b360-43a4-9780-75059a19a711" containerName="sequence-graph-18dc8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:07:20.973882 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:20.973496 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:07:21.003296 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:21.003253 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Mar 18 17:07:21.869019 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:21.868971 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Mar 18 17:07:22.589341 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:07:22.589310 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:07:25.973384 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:25.973336 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" podUID="603b043e-b360-43a4-9780-75059a19a711" containerName="sequence-graph-18dc8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:07:30.973322 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:30.973281 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" podUID="603b043e-b360-43a4-9780-75059a19a711" containerName="sequence-graph-18dc8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:07:31.004037 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:31.004003 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Mar 18 17:07:31.869708 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:31.869668 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" Mar 18 17:07:35.973024 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:35.972967 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" podUID="603b043e-b360-43a4-9780-75059a19a711" containerName="sequence-graph-18dc8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:07:37.589711 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:07:37.589678 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:07:39.076814 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.076789 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:07:39.104340 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.104307 2578 generic.go:358] "Generic (PLEG): container finished" podID="603b043e-b360-43a4-9780-75059a19a711" containerID="a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b" exitCode=0 Mar 18 17:07:39.104489 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.104385 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" Mar 18 17:07:39.104489 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.104393 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" event={"ID":"603b043e-b360-43a4-9780-75059a19a711","Type":"ContainerDied","Data":"a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b"} Mar 18 17:07:39.104489 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.104431 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx" event={"ID":"603b043e-b360-43a4-9780-75059a19a711","Type":"ContainerDied","Data":"1e02f6ed7a2f0ed0939f6770fe5a8c67e711eccf30f3d1e7eca77c8e6dc6a9c9"} Mar 18 17:07:39.104489 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.104450 2578 scope.go:117] "RemoveContainer" containerID="a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b" Mar 18 17:07:39.114651 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.114625 2578 scope.go:117] "RemoveContainer" containerID="a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b" Mar 18 17:07:39.114951 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:07:39.114932 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b\": container with ID starting with a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b not found: ID does not exist" containerID="a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b" Mar 18 17:07:39.115001 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.114960 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b"} err="failed to get container status \"a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b\": rpc error: code = NotFound desc = could not find container \"a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b\": container with ID starting with a0f5d4fedc85a89bb810de130597a5df33b755c10591c9cd19fe2c9815e3b30b not found: ID does not exist" Mar 18 17:07:39.128736 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.128717 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/603b043e-b360-43a4-9780-75059a19a711-proxy-tls\") pod \"603b043e-b360-43a4-9780-75059a19a711\" (UID: \"603b043e-b360-43a4-9780-75059a19a711\") " Mar 18 17:07:39.128808 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.128751 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/603b043e-b360-43a4-9780-75059a19a711-openshift-service-ca-bundle\") pod \"603b043e-b360-43a4-9780-75059a19a711\" (UID: \"603b043e-b360-43a4-9780-75059a19a711\") " Mar 18 17:07:39.129131 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.129107 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603b043e-b360-43a4-9780-75059a19a711-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "603b043e-b360-43a4-9780-75059a19a711" (UID: "603b043e-b360-43a4-9780-75059a19a711"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 17:07:39.130734 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.130712 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603b043e-b360-43a4-9780-75059a19a711-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "603b043e-b360-43a4-9780-75059a19a711" (UID: "603b043e-b360-43a4-9780-75059a19a711"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 17:07:39.229819 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.229730 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/603b043e-b360-43a4-9780-75059a19a711-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:07:39.229819 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.229765 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/603b043e-b360-43a4-9780-75059a19a711-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:07:39.423474 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.423445 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx"] Mar 18 17:07:39.425696 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:39.425671 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-18dc8-546dc7d746-98whx"] Mar 18 17:07:40.593095 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:40.593057 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603b043e-b360-43a4-9780-75059a19a711" path="/var/lib/kubelet/pods/603b043e-b360-43a4-9780-75059a19a711/volumes" Mar 18 17:07:41.003515 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:41.003424 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Mar 18 17:07:49.601316 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.601272 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7"] Mar 18 17:07:49.601809 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.601781 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="603b043e-b360-43a4-9780-75059a19a711" containerName="sequence-graph-18dc8" Mar 18 17:07:49.601809 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.601801 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="603b043e-b360-43a4-9780-75059a19a711" containerName="sequence-graph-18dc8" Mar 18 17:07:49.601927 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.601832 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" Mar 18 17:07:49.601927 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.601840 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" Mar 18 17:07:49.601927 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.601924 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ec39ecc-1be3-42be-86e0-f176bf8e08e1" containerName="kserve-container" Mar 18 17:07:49.602066 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.601940 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="603b043e-b360-43a4-9780-75059a19a711" containerName="sequence-graph-18dc8" Mar 18 17:07:49.606188 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.606165 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:07:49.607967 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.607946 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-19f83-serving-cert\"" Mar 18 17:07:49.608093 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.607952 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-19f83-kube-rbac-proxy-sar-config\"" Mar 18 17:07:49.612382 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.612352 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7"] Mar 18 17:07:49.703911 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.703863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-proxy-tls\") pod \"ensemble-graph-19f83-b5859dffb-4pxf7\" (UID: \"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a\") " pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:07:49.703911 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.703911 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-openshift-service-ca-bundle\") pod \"ensemble-graph-19f83-b5859dffb-4pxf7\" (UID: \"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a\") " pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:07:49.804925 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.804893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-proxy-tls\") pod \"ensemble-graph-19f83-b5859dffb-4pxf7\" (UID: \"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a\") " pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:07:49.804925 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.804931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-openshift-service-ca-bundle\") pod \"ensemble-graph-19f83-b5859dffb-4pxf7\" (UID: \"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a\") " pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:07:49.805646 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.805618 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-openshift-service-ca-bundle\") pod \"ensemble-graph-19f83-b5859dffb-4pxf7\" (UID: \"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a\") " pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:07:49.807557 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.807537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-proxy-tls\") pod \"ensemble-graph-19f83-b5859dffb-4pxf7\" (UID: \"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a\") " pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:07:49.917636 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:49.917509 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:07:50.043846 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:50.043722 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7"] Mar 18 17:07:50.046516 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:07:50.046483 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a6472d_1553_4fe1_8f9c_13d2dee6d93a.slice/crio-feec5d767e7178a88704ef654810a18f1cbc7953eb833a4353ab4f5b1ea72416 WatchSource:0}: Error finding container feec5d767e7178a88704ef654810a18f1cbc7953eb833a4353ab4f5b1ea72416: Status 404 returned error can't find the container with id feec5d767e7178a88704ef654810a18f1cbc7953eb833a4353ab4f5b1ea72416 Mar 18 17:07:50.142300 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:50.142264 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" event={"ID":"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a","Type":"ContainerStarted","Data":"d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a"} Mar 18 17:07:50.142300 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:50.142302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" event={"ID":"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a","Type":"ContainerStarted","Data":"feec5d767e7178a88704ef654810a18f1cbc7953eb833a4353ab4f5b1ea72416"} Mar 18 17:07:50.142547 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:50.142399 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:07:50.159017 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:50.158967 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" podStartSLOduration=1.158952687 podStartE2EDuration="1.158952687s" podCreationTimestamp="2026-03-18 17:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:07:50.15680972 +0000 UTC m=+1424.168918542" watchObservedRunningTime="2026-03-18 17:07:50.158952687 +0000 UTC m=+1424.171061524" Mar 18 17:07:51.003598 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:51.003556 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Mar 18 17:07:52.589288 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:07:52.589239 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" 
podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:07:56.151215 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:07:56.151142 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:08:01.004195 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:01.004145 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Mar 18 17:08:05.589877 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:08:05.589839 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:08:11.005076 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:11.005044 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" Mar 18 17:08:18.589195 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:08:18.589150 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:08:29.289217 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:29.289177 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk"] Mar 18 17:08:29.293927 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:29.293906 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:29.295769 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:29.295741 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-8d83a-kube-rbac-proxy-sar-config\"" Mar 18 17:08:29.295888 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:29.295831 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-8d83a-serving-cert\"" Mar 18 17:08:29.299277 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:29.299251 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk"] Mar 18 17:08:29.439941 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:29.439903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e1881b-e2d9-4786-8624-1d663c24a219-openshift-service-ca-bundle\") pod \"sequence-graph-8d83a-775c8cf4fc-289tk\" (UID: \"48e1881b-e2d9-4786-8624-1d663c24a219\") " pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:29.440125 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:29.439992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48e1881b-e2d9-4786-8624-1d663c24a219-proxy-tls\") pod \"sequence-graph-8d83a-775c8cf4fc-289tk\" (UID: \"48e1881b-e2d9-4786-8624-1d663c24a219\") " pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:29.541003 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:29.540913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e1881b-e2d9-4786-8624-1d663c24a219-openshift-service-ca-bundle\") pod \"sequence-graph-8d83a-775c8cf4fc-289tk\" (UID: \"48e1881b-e2d9-4786-8624-1d663c24a219\") " pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:29.541003 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:29.540987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48e1881b-e2d9-4786-8624-1d663c24a219-proxy-tls\") pod \"sequence-graph-8d83a-775c8cf4fc-289tk\" (UID: \"48e1881b-e2d9-4786-8624-1d663c24a219\") " pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:29.541206 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:08:29.541100 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-8d83a-serving-cert: secret "sequence-graph-8d83a-serving-cert" not found Mar 18 17:08:29.541206 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:08:29.541162 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e1881b-e2d9-4786-8624-1d663c24a219-proxy-tls podName:48e1881b-e2d9-4786-8624-1d663c24a219 nodeName:}" failed. No retries permitted until 2026-03-18 17:08:30.041146329 +0000 UTC m=+1464.053255133 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/48e1881b-e2d9-4786-8624-1d663c24a219-proxy-tls") pod "sequence-graph-8d83a-775c8cf4fc-289tk" (UID: "48e1881b-e2d9-4786-8624-1d663c24a219") : secret "sequence-graph-8d83a-serving-cert" not found Mar 18 17:08:29.541590 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:29.541570 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e1881b-e2d9-4786-8624-1d663c24a219-openshift-service-ca-bundle\") pod \"sequence-graph-8d83a-775c8cf4fc-289tk\" (UID: \"48e1881b-e2d9-4786-8624-1d663c24a219\") " pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:30.045438 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:30.045396 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48e1881b-e2d9-4786-8624-1d663c24a219-proxy-tls\") pod \"sequence-graph-8d83a-775c8cf4fc-289tk\" (UID: \"48e1881b-e2d9-4786-8624-1d663c24a219\") " pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:30.047832 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:30.047807 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48e1881b-e2d9-4786-8624-1d663c24a219-proxy-tls\") pod \"sequence-graph-8d83a-775c8cf4fc-289tk\" (UID: \"48e1881b-e2d9-4786-8624-1d663c24a219\") " pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:30.205438 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:30.205401 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:30.330627 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:30.330546 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk"] Mar 18 17:08:30.334225 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:08:30.334196 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48e1881b_e2d9_4786_8624_1d663c24a219.slice/crio-369fe5e770054c291bb7b89a53a50401de71033a4fd34ce2037400d1baecd241 WatchSource:0}: Error finding container 369fe5e770054c291bb7b89a53a50401de71033a4fd34ce2037400d1baecd241: Status 404 returned error can't find the container with id 369fe5e770054c291bb7b89a53a50401de71033a4fd34ce2037400d1baecd241 Mar 18 17:08:30.589163 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:08:30.589127 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:08:31.277094 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:31.277058 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" event={"ID":"48e1881b-e2d9-4786-8624-1d663c24a219","Type":"ContainerStarted","Data":"2dce5aea52ed1b4040a69c0a27268f088dd55a0b47eb8fc6d6b37046cc6ceaa6"} Mar 18 17:08:31.277094 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:31.277096 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" event={"ID":"48e1881b-e2d9-4786-8624-1d663c24a219","Type":"ContainerStarted","Data":"369fe5e770054c291bb7b89a53a50401de71033a4fd34ce2037400d1baecd241"} Mar 18 17:08:31.277349 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:31.277200 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:31.295391 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:31.295342 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" podStartSLOduration=2.295325844 podStartE2EDuration="2.295325844s" podCreationTimestamp="2026-03-18 17:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:08:31.293197451 +0000 UTC m=+1465.305306277" watchObservedRunningTime="2026-03-18 17:08:31.295325844 +0000 UTC m=+1465.307434668" Mar 18 17:08:37.286050 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:08:37.286023 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:08:43.589644 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:08:43.589608 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:08:58.997806 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:08:58.997750 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:08:58.998204 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:08:58.997954 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 17:08:58.999176 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:08:58.999135 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:09:09.589765 
ip-10-0-143-175 kubenswrapper[2578]: E0318 17:09:09.589724 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:09:21.589182 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:09:21.589150 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:09:36.591756 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:09:36.591582 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:09:49.591731 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:09:49.589801 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:10:02.589338 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:10:02.589303 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:10:14.590139 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:10:14.590107 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:10:27.589626 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:10:27.589428 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:10:27.589889 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:10:27.589631 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:10:40.589439 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:10:40.589408 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:10:51.589571 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:10:51.589536 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:11:03.589731 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:11:03.589623 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:11:14.589492 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:11:14.589461 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:11:26.593783 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:11:26.593751 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:11:37.589930 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:11:37.589891 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI 
artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:11:48.590151 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:11:48.590042 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:12:02.589052 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:12:02.589010 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:12:17.589906 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:12:17.589860 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:12:29.589433 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:12:29.589395 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:12:43.589327 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:12:43.589295 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:12:54.589550 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:12:54.589441 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:13:05.589957 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:13:05.589922 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:13:19.589650 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:13:19.589617 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading 
manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:13:34.590185 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:13:34.590006 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:13:47.589898 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:13:47.589862 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:13:58.589575 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:13:58.589538 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:14:09.973240 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:14:09.973131 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:14:09.974398 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:14:09.974357 2578 kuberuntime_manager.go:1358] 
"Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 17:14:09.975545 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:14:09.975506 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" 
podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:14:22.589950 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:14:22.589917 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:14:36.591734 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:14:36.591636 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:14:50.591038 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:14:50.589087 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:15:04.589981 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:15:04.589949 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:15:15.589859 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:15:15.589825 2578 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:15:30.589814 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:15:30.589610 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:15:30.590104 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:15:30.589864 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:15:45.589890 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:15:45.589856 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:16:00.589287 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:16:00.589254 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:16:04.222396 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:04.222360 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7"] Mar 18 17:16:04.222927 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:04.222647 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerName="ensemble-graph-19f83" containerID="cri-o://d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a" gracePeriod=30 Mar 18 17:16:04.676900 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:04.676864 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns"] Mar 18 17:16:04.677140 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:04.677118 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" containerID="cri-o://2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6" gracePeriod=30 Mar 18 17:16:04.774324 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:04.774290 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr"] Mar 18 17:16:04.779140 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:04.779112 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" Mar 18 17:16:04.782420 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:04.782391 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr"] Mar 18 17:16:04.878255 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:04.878215 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9pdj\" (UniqueName: \"kubernetes.io/projected/317736d7-450d-41e3-b7bf-0cd269b30334-kube-api-access-h9pdj\") pod \"success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr\" (UID: \"317736d7-450d-41e3-b7bf-0cd269b30334\") " pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" Mar 18 17:16:04.979628 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:04.979498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9pdj\" (UniqueName: \"kubernetes.io/projected/317736d7-450d-41e3-b7bf-0cd269b30334-kube-api-access-h9pdj\") pod \"success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr\" (UID: \"317736d7-450d-41e3-b7bf-0cd269b30334\") " pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" Mar 18 17:16:04.987286 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:04.987256 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9pdj\" (UniqueName: \"kubernetes.io/projected/317736d7-450d-41e3-b7bf-0cd269b30334-kube-api-access-h9pdj\") pod \"success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr\" (UID: \"317736d7-450d-41e3-b7bf-0cd269b30334\") " pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" Mar 18 17:16:05.093105 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:05.093068 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" Mar 18 17:16:05.221024 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:05.220967 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr"] Mar 18 17:16:05.224474 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:16:05.224442 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod317736d7_450d_41e3_b7bf_0cd269b30334.slice/crio-028882afd801b0096f506a19565e67d1fd046eff4e37a3d7ee5cdfdf83a9e49c WatchSource:0}: Error finding container 028882afd801b0096f506a19565e67d1fd046eff4e37a3d7ee5cdfdf83a9e49c: Status 404 returned error can't find the container with id 028882afd801b0096f506a19565e67d1fd046eff4e37a3d7ee5cdfdf83a9e49c Mar 18 17:16:05.738797 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:05.738751 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" event={"ID":"317736d7-450d-41e3-b7bf-0cd269b30334","Type":"ContainerStarted","Data":"9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66"} Mar 18 17:16:05.738797 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:05.738796 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" event={"ID":"317736d7-450d-41e3-b7bf-0cd269b30334","Type":"ContainerStarted","Data":"028882afd801b0096f506a19565e67d1fd046eff4e37a3d7ee5cdfdf83a9e49c"} Mar 18 17:16:05.739042 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:05.738988 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" Mar 18 17:16:05.740293 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:05.740260 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Mar 18 17:16:05.753752 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:05.753703 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" podStartSLOduration=1.7536879779999999 podStartE2EDuration="1.753687978s" podCreationTimestamp="2026-03-18 17:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:16:05.752733563 +0000 UTC m=+1919.764842388" watchObservedRunningTime="2026-03-18 17:16:05.753687978 +0000 UTC m=+1919.765796780" Mar 18 17:16:06.150014 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:06.149972 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerName="ensemble-graph-19f83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:16:06.743248 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:06.743200 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: 
connection refused" Mar 18 17:16:07.918720 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:07.918694 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" Mar 18 17:16:08.004316 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.004227 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29pbd\" (UniqueName: \"kubernetes.io/projected/97d9ff2f-d076-4548-83cd-129f594d7745-kube-api-access-29pbd\") pod \"97d9ff2f-d076-4548-83cd-129f594d7745\" (UID: \"97d9ff2f-d076-4548-83cd-129f594d7745\") " Mar 18 17:16:08.006295 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.006269 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d9ff2f-d076-4548-83cd-129f594d7745-kube-api-access-29pbd" (OuterVolumeSpecName: "kube-api-access-29pbd") pod "97d9ff2f-d076-4548-83cd-129f594d7745" (UID: "97d9ff2f-d076-4548-83cd-129f594d7745"). InnerVolumeSpecName "kube-api-access-29pbd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:16:08.104844 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.104807 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29pbd\" (UniqueName: \"kubernetes.io/projected/97d9ff2f-d076-4548-83cd-129f594d7745-kube-api-access-29pbd\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:16:08.750910 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.750877 2578 generic.go:358] "Generic (PLEG): container finished" podID="97d9ff2f-d076-4548-83cd-129f594d7745" containerID="2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6" exitCode=0 Mar 18 17:16:08.751079 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.750935 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" Mar 18 17:16:08.751079 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.750970 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" event={"ID":"97d9ff2f-d076-4548-83cd-129f594d7745","Type":"ContainerDied","Data":"2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6"} Mar 18 17:16:08.751079 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.751009 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns" event={"ID":"97d9ff2f-d076-4548-83cd-129f594d7745","Type":"ContainerDied","Data":"bf4764c1ca9bed19c03010222b5cee31c738c13fcf5edb1e5b45539be719dd4b"} Mar 18 17:16:08.751079 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.751031 2578 scope.go:117] "RemoveContainer" containerID="2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6" Mar 18 17:16:08.759320 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.759298 2578 scope.go:117] "RemoveContainer" containerID="2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6" Mar 18 17:16:08.759664 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:16:08.759644 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6\": container with ID starting with 2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6 not found: ID does not exist" containerID="2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6" Mar 18 17:16:08.759730 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.759675 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6"} err="failed to get container status \"2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6\": rpc error: code = NotFound desc = could not find container \"2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6\": container with ID starting with 2e19f7d8e3a622c66578d77066763b5aa8b3880e23beb7aa1e6016abb3f81db6 not found: ID does not exist" Mar 18 17:16:08.765575 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.765552 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns"] Mar 18 17:16:08.768649 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:08.768629 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-19f83-predictor-b5c694bf5-xkjns"] Mar 18 17:16:10.592519 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:10.592485 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" path="/var/lib/kubelet/pods/97d9ff2f-d076-4548-83cd-129f594d7745/volumes" Mar 18 17:16:11.149916 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:11.149871 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerName="ensemble-graph-19f83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:16:15.589880 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:16:15.589846 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:16:16.149914 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:16.149870 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerName="ensemble-graph-19f83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:16:16.150094 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:16.149974 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:16:16.743360 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:16.743317 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Mar 18 17:16:21.150695 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:21.150648 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerName="ensemble-graph-19f83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:16:26.150539 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:26.150489 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerName="ensemble-graph-19f83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:16:26.743514 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:26.743460 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Mar 18 17:16:27.589491 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:16:27.589460 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 
17:16:31.149823 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:31.149785 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerName="ensemble-graph-19f83" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:16:34.365995 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.365971 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:16:34.529363 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.529272 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-openshift-service-ca-bundle\") pod \"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a\" (UID: \"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a\") " Mar 18 17:16:34.529363 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.529322 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-proxy-tls\") pod \"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a\" (UID: \"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a\") " Mar 18 17:16:34.529669 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.529645 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" (UID: "d6a6472d-1553-4fe1-8f9c-13d2dee6d93a"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 17:16:34.531537 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.531497 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" (UID: "d6a6472d-1553-4fe1-8f9c-13d2dee6d93a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 17:16:34.630405 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.630370 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:16:34.630593 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.630423 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:16:34.835053 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.835016 2578 generic.go:358] "Generic (PLEG): container finished" podID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerID="d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a" exitCode=0 Mar 18 17:16:34.835234 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.835077 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" Mar 18 17:16:34.835234 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.835091 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" event={"ID":"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a","Type":"ContainerDied","Data":"d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a"} Mar 18 17:16:34.835234 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.835127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7" event={"ID":"d6a6472d-1553-4fe1-8f9c-13d2dee6d93a","Type":"ContainerDied","Data":"feec5d767e7178a88704ef654810a18f1cbc7953eb833a4353ab4f5b1ea72416"} Mar 18 17:16:34.835234 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.835142 2578 scope.go:117] "RemoveContainer" containerID="d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a" Mar 18 17:16:34.843200 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.843180 2578 scope.go:117] "RemoveContainer" containerID="d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a" Mar 18 17:16:34.843459 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:16:34.843439 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a\": container with ID starting with d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a not found: ID does not exist" containerID="d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a" Mar 18 17:16:34.843545 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.843470 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a"} err="failed to get container status \"d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a\": rpc error: code = NotFound desc = could not find container \"d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a\": container with ID starting with d169c5684029224d0db5277b15c76b8119491f07a925f4f1428d442f9778867a not found: ID does not exist" Mar 18 17:16:34.849557 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.849515 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7"] Mar 18 17:16:34.851737 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:34.851714 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-19f83-b5859dffb-4pxf7"] Mar 18 17:16:36.592218 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:36.592176 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" path="/var/lib/kubelet/pods/d6a6472d-1553-4fe1-8f9c-13d2dee6d93a/volumes" Mar 18 17:16:36.743852 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:36.743810 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Mar 18 17:16:42.590226 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:16:42.590184 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:16:43.858176 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:43.858144 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk"] Mar 18 17:16:43.858600 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:43.858392 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" containerName="sequence-graph-8d83a" containerID="cri-o://2dce5aea52ed1b4040a69c0a27268f088dd55a0b47eb8fc6d6b37046cc6ceaa6" gracePeriod=30 Mar 18 17:16:43.990943 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:43.990908 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv"] Mar 18 17:16:43.991165 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:43.991144 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" containerID="cri-o://6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b" gracePeriod=30 Mar 18 17:16:44.253857 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.253779 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89"] Mar 18 17:16:44.254131 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.254112 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerName="ensemble-graph-19f83" Mar 18 17:16:44.254131 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.254126 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerName="ensemble-graph-19f83" Mar 18 17:16:44.254212 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.254138 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" Mar 18 17:16:44.254212 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.254143 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" Mar 18 17:16:44.254212 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.254210 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="97d9ff2f-d076-4548-83cd-129f594d7745" containerName="kserve-container" Mar 18 17:16:44.254304 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.254220 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6a6472d-1553-4fe1-8f9c-13d2dee6d93a" containerName="ensemble-graph-19f83" Mar 18 17:16:44.257283 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.257265 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" Mar 18 17:16:44.264073 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.264042 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89"] Mar 18 17:16:44.415991 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.415957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz5hh\" (UniqueName: \"kubernetes.io/projected/8ec60ecd-1545-462c-9db3-184ba0c557d0-kube-api-access-lz5hh\") pod \"success-200-isvc-eb922-predictor-ff68cf985-w7v89\" (UID: \"8ec60ecd-1545-462c-9db3-184ba0c557d0\") " pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" Mar 18 17:16:44.517265 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.517182 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lz5hh\" (UniqueName: \"kubernetes.io/projected/8ec60ecd-1545-462c-9db3-184ba0c557d0-kube-api-access-lz5hh\") pod \"success-200-isvc-eb922-predictor-ff68cf985-w7v89\" (UID: \"8ec60ecd-1545-462c-9db3-184ba0c557d0\") " pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" Mar 18 17:16:44.525002 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.524980 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz5hh\" (UniqueName: \"kubernetes.io/projected/8ec60ecd-1545-462c-9db3-184ba0c557d0-kube-api-access-lz5hh\") pod \"success-200-isvc-eb922-predictor-ff68cf985-w7v89\" (UID: \"8ec60ecd-1545-462c-9db3-184ba0c557d0\") " pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" Mar 18 17:16:44.568725 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.568694 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" Mar 18 17:16:44.695491 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.695458 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89"] Mar 18 17:16:44.697668 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:16:44.697640 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ec60ecd_1545_462c_9db3_184ba0c557d0.slice/crio-e3f4af55cf100305a62dc4be92e91a3141922522af29f990f477e8314b7f2e73 WatchSource:0}: Error finding container e3f4af55cf100305a62dc4be92e91a3141922522af29f990f477e8314b7f2e73: Status 404 returned error can't find the container with id e3f4af55cf100305a62dc4be92e91a3141922522af29f990f477e8314b7f2e73 Mar 18 17:16:44.868196 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.868160 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" event={"ID":"8ec60ecd-1545-462c-9db3-184ba0c557d0","Type":"ContainerStarted","Data":"6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184"} Mar 18 17:16:44.868196 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.868198 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" event={"ID":"8ec60ecd-1545-462c-9db3-184ba0c557d0","Type":"ContainerStarted","Data":"e3f4af55cf100305a62dc4be92e91a3141922522af29f990f477e8314b7f2e73"} Mar 18 17:16:44.868686 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.868301 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" Mar 18 17:16:44.869759 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.869739 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Mar 18 17:16:44.883492 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:44.883444 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" podStartSLOduration=0.88343082 podStartE2EDuration="883.43082ms" podCreationTimestamp="2026-03-18 17:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:16:44.881928726 +0000 UTC m=+1958.894037550" watchObservedRunningTime="2026-03-18 17:16:44.88343082 +0000 UTC m=+1958.895539701" Mar 18 17:16:45.871790 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:45.871748 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Mar 18 17:16:46.743304 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:46.743245 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: 
connect: connection refused" Mar 18 17:16:47.234230 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.234204 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" Mar 18 17:16:47.283802 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.283766 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" containerName="sequence-graph-8d83a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:16:47.344697 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.344548 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdm9n\" (UniqueName: \"kubernetes.io/projected/bb38c4ad-38c5-4006-9eab-4a67b9575f24-kube-api-access-xdm9n\") pod \"bb38c4ad-38c5-4006-9eab-4a67b9575f24\" (UID: \"bb38c4ad-38c5-4006-9eab-4a67b9575f24\") " Mar 18 17:16:47.346629 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.346595 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb38c4ad-38c5-4006-9eab-4a67b9575f24-kube-api-access-xdm9n" (OuterVolumeSpecName: "kube-api-access-xdm9n") pod "bb38c4ad-38c5-4006-9eab-4a67b9575f24" (UID: "bb38c4ad-38c5-4006-9eab-4a67b9575f24"). InnerVolumeSpecName "kube-api-access-xdm9n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:16:47.445573 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.445512 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xdm9n\" (UniqueName: \"kubernetes.io/projected/bb38c4ad-38c5-4006-9eab-4a67b9575f24-kube-api-access-xdm9n\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:16:47.880323 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.880285 2578 generic.go:358] "Generic (PLEG): container finished" podID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerID="6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b" exitCode=0 Mar 18 17:16:47.880497 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.880353 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" Mar 18 17:16:47.880497 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.880362 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" event={"ID":"bb38c4ad-38c5-4006-9eab-4a67b9575f24","Type":"ContainerDied","Data":"6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b"} Mar 18 17:16:47.880497 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.880394 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv" event={"ID":"bb38c4ad-38c5-4006-9eab-4a67b9575f24","Type":"ContainerDied","Data":"6a2f40fe2de51fb813d6bab03801f7c31721e10e313a772c72dc78d600de5a56"} Mar 18 17:16:47.880497 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.880416 2578 scope.go:117] "RemoveContainer" containerID="6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b" Mar 18 17:16:47.889085 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.889065 2578 scope.go:117] "RemoveContainer" containerID="6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b" Mar 18 17:16:47.889322 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:16:47.889300 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b\": container with ID starting with 6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b not found: ID does not exist" containerID="6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b" Mar 18 17:16:47.889385 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.889331 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b"} err="failed to get container status \"6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b\": rpc error: code = NotFound desc = could not find container \"6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b\": container with ID starting with 6b4ed51a3bd67c4a6857b973ff13e7952cd3072073b2093aac9be6242ba41f1b not found: ID does not exist" Mar 18 17:16:47.899229 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.899204 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv"] Mar 18 17:16:47.904324 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:47.904303 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d83a-predictor-7f9fd554b9-gb4cv"] Mar 18 17:16:48.593110 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:48.593076 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" path="/var/lib/kubelet/pods/bb38c4ad-38c5-4006-9eab-4a67b9575f24/volumes" Mar 18 17:16:52.284129 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:52.284082 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" containerName="sequence-graph-8d83a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:16:55.872687 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:55.872593 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Mar 18 17:16:56.592045 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:16:56.591901 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:16:56.743593 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:56.743511 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Mar 18 17:16:57.284400 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:57.284354 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" containerName="sequence-graph-8d83a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:16:57.284788 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:16:57.284467 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:17:02.284489 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:02.284445 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" containerName="sequence-graph-8d83a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:17:05.872845 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:05.872787 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Mar 18 17:17:06.745374 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:06.745346 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" Mar 18 17:17:07.284023 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:07.283983 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" containerName="sequence-graph-8d83a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:17:11.589686 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:17:11.589638 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:17:12.284715 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:12.284674 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" containerName="sequence-graph-8d83a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:17:13.965446 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:13.965412 2578 generic.go:358] "Generic (PLEG): container finished" podID="48e1881b-e2d9-4786-8624-1d663c24a219" containerID="2dce5aea52ed1b4040a69c0a27268f088dd55a0b47eb8fc6d6b37046cc6ceaa6" exitCode=0 Mar 18 17:17:13.965846 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:13.965481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" event={"ID":"48e1881b-e2d9-4786-8624-1d663c24a219","Type":"ContainerDied","Data":"2dce5aea52ed1b4040a69c0a27268f088dd55a0b47eb8fc6d6b37046cc6ceaa6"} Mar 18 17:17:13.996519 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:13.996496 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:17:14.048231 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.048202 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48e1881b-e2d9-4786-8624-1d663c24a219-proxy-tls\") pod \"48e1881b-e2d9-4786-8624-1d663c24a219\" (UID: \"48e1881b-e2d9-4786-8624-1d663c24a219\") " Mar 18 17:17:14.048402 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.048340 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e1881b-e2d9-4786-8624-1d663c24a219-openshift-service-ca-bundle\") pod \"48e1881b-e2d9-4786-8624-1d663c24a219\" (UID: \"48e1881b-e2d9-4786-8624-1d663c24a219\") " Mar 18 17:17:14.048685 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.048660 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e1881b-e2d9-4786-8624-1d663c24a219-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "48e1881b-e2d9-4786-8624-1d663c24a219" (UID: "48e1881b-e2d9-4786-8624-1d663c24a219"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 17:17:14.050178 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.050160 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e1881b-e2d9-4786-8624-1d663c24a219-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "48e1881b-e2d9-4786-8624-1d663c24a219" (UID: "48e1881b-e2d9-4786-8624-1d663c24a219"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 17:17:14.149670 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.149586 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48e1881b-e2d9-4786-8624-1d663c24a219-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:17:14.149670 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.149616 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48e1881b-e2d9-4786-8624-1d663c24a219-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:17:14.970282 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.970189 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" Mar 18 17:17:14.970282 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.970201 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk" event={"ID":"48e1881b-e2d9-4786-8624-1d663c24a219","Type":"ContainerDied","Data":"369fe5e770054c291bb7b89a53a50401de71033a4fd34ce2037400d1baecd241"} Mar 18 17:17:14.970282 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.970249 2578 scope.go:117] "RemoveContainer" containerID="2dce5aea52ed1b4040a69c0a27268f088dd55a0b47eb8fc6d6b37046cc6ceaa6" Mar 18 17:17:14.984249 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.984228 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk"] Mar 18 17:17:14.987519 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:14.987496 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-8d83a-775c8cf4fc-289tk"] Mar 18 17:17:15.872617 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:15.872569 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Mar 18 17:17:16.593161 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:16.593127 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" path="/var/lib/kubelet/pods/48e1881b-e2d9-4786-8624-1d663c24a219/volumes" Mar 18 17:17:24.528508 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.528470 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v"] Mar 18 17:17:24.529026 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.528838 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" containerName="sequence-graph-8d83a" Mar 18 17:17:24.529026 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.528851 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" containerName="sequence-graph-8d83a" Mar 18 17:17:24.529026 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.528861 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" Mar 18 17:17:24.529026 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.528868 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" Mar 18 17:17:24.529026 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.528918 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="48e1881b-e2d9-4786-8624-1d663c24a219" containerName="sequence-graph-8d83a" Mar 18 17:17:24.529026 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.528928 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb38c4ad-38c5-4006-9eab-4a67b9575f24" containerName="kserve-container" Mar 18 17:17:24.533143 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.533125 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:24.534991 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.534953 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-0c4f8-serving-cert\"" Mar 18 17:17:24.535112 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.535020 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-0c4f8-kube-rbac-proxy-sar-config\"" Mar 18 17:17:24.539951 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.539925 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v"] Mar 18 17:17:24.644090 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.644046 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6557ac99-6842-4706-83ab-858761eb5031-openshift-service-ca-bundle\") pod \"splitter-graph-0c4f8-7c56559465-2s29v\" (UID: \"6557ac99-6842-4706-83ab-858761eb5031\") " pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:24.644299 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.644118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6557ac99-6842-4706-83ab-858761eb5031-proxy-tls\") pod \"splitter-graph-0c4f8-7c56559465-2s29v\" (UID: \"6557ac99-6842-4706-83ab-858761eb5031\") " pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:24.745561 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.745504 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6557ac99-6842-4706-83ab-858761eb5031-openshift-service-ca-bundle\") pod \"splitter-graph-0c4f8-7c56559465-2s29v\" (UID: \"6557ac99-6842-4706-83ab-858761eb5031\") " pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:24.745754 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.745594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6557ac99-6842-4706-83ab-858761eb5031-proxy-tls\") pod \"splitter-graph-0c4f8-7c56559465-2s29v\" (UID: \"6557ac99-6842-4706-83ab-858761eb5031\") " pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:24.746129 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.746098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6557ac99-6842-4706-83ab-858761eb5031-openshift-service-ca-bundle\") pod 
\"splitter-graph-0c4f8-7c56559465-2s29v\" (UID: \"6557ac99-6842-4706-83ab-858761eb5031\") " pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:24.747973 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.747950 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6557ac99-6842-4706-83ab-858761eb5031-proxy-tls\") pod \"splitter-graph-0c4f8-7c56559465-2s29v\" (UID: \"6557ac99-6842-4706-83ab-858761eb5031\") " pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:24.846235 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.846196 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:24.966148 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:24.966122 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v"] Mar 18 17:17:24.968605 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:17:24.968571 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6557ac99_6842_4706_83ab_858761eb5031.slice/crio-7a6a5ab1030b15bdb93873d7cf1dcb781f8f0f86235cb20cecbde08ca17d7f1d WatchSource:0}: Error finding container 7a6a5ab1030b15bdb93873d7cf1dcb781f8f0f86235cb20cecbde08ca17d7f1d: Status 404 returned error can't find the container with id 7a6a5ab1030b15bdb93873d7cf1dcb781f8f0f86235cb20cecbde08ca17d7f1d Mar 18 17:17:25.002142 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:25.002106 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" event={"ID":"6557ac99-6842-4706-83ab-858761eb5031","Type":"ContainerStarted","Data":"7a6a5ab1030b15bdb93873d7cf1dcb781f8f0f86235cb20cecbde08ca17d7f1d"} Mar 18 17:17:25.872829 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:25.872783 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Mar 18 17:17:26.006298 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:26.006263 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" event={"ID":"6557ac99-6842-4706-83ab-858761eb5031","Type":"ContainerStarted","Data":"f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014"} Mar 18 17:17:26.006467 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:26.006363 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:26.022619 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:26.022569 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" podStartSLOduration=2.022553502 podStartE2EDuration="2.022553502s" podCreationTimestamp="2026-03-18 17:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:17:26.021675429 +0000 UTC m=+2000.033784256" watchObservedRunningTime="2026-03-18 17:17:26.022553502 +0000 UTC m=+2000.034662324" Mar 18 17:17:26.592505 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:17:26.592465 2578 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:17:32.015670 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:32.015641 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:34.593741 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:34.593704 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v"] Mar 18 17:17:34.594236 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:34.593958 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" podUID="6557ac99-6842-4706-83ab-858761eb5031" containerName="splitter-graph-0c4f8" containerID="cri-o://f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014" gracePeriod=30 Mar 18 17:17:34.967343 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:34.967253 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr"] Mar 18 17:17:34.967627 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:34.967581 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" containerID="cri-o://9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66" gracePeriod=30 Mar 18 17:17:35.086249 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:35.086211 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56"] Mar 18 17:17:35.089594 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:35.089572 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" Mar 18 17:17:35.097779 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:35.097754 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56"] Mar 18 17:17:35.231367 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:35.231278 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72pl2\" (UniqueName: \"kubernetes.io/projected/98b3763c-585d-418a-a124-a44e45ae1196-kube-api-access-72pl2\") pod \"success-200-isvc-6eae8-predictor-7b94df567-r9c56\" (UID: \"98b3763c-585d-418a-a124-a44e45ae1196\") " pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" Mar 18 17:17:35.332047 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:35.332019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72pl2\" (UniqueName: \"kubernetes.io/projected/98b3763c-585d-418a-a124-a44e45ae1196-kube-api-access-72pl2\") pod \"success-200-isvc-6eae8-predictor-7b94df567-r9c56\" (UID: \"98b3763c-585d-418a-a124-a44e45ae1196\") " pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" Mar 18 17:17:35.339708 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:35.339675 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72pl2\" (UniqueName: \"kubernetes.io/projected/98b3763c-585d-418a-a124-a44e45ae1196-kube-api-access-72pl2\") pod \"success-200-isvc-6eae8-predictor-7b94df567-r9c56\" (UID: \"98b3763c-585d-418a-a124-a44e45ae1196\") " pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" Mar 18 17:17:35.400442 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:35.400410 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" Mar 18 17:17:35.522800 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:35.522720 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56"] Mar 18 17:17:35.525052 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:17:35.525002 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98b3763c_585d_418a_a124_a44e45ae1196.slice/crio-507f867018ea03d52d98fd58acd15e882bc8d5c540f28dd987524899dfa0295e WatchSource:0}: Error finding container 507f867018ea03d52d98fd58acd15e882bc8d5c540f28dd987524899dfa0295e: Status 404 returned error can't find the container with id 507f867018ea03d52d98fd58acd15e882bc8d5c540f28dd987524899dfa0295e Mar 18 17:17:35.872429 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:35.872384 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Mar 18 17:17:36.038223 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:36.038183 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" event={"ID":"98b3763c-585d-418a-a124-a44e45ae1196","Type":"ContainerStarted","Data":"4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462"} Mar 18 17:17:36.038223 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:36.038220 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" event={"ID":"98b3763c-585d-418a-a124-a44e45ae1196","Type":"ContainerStarted","Data":"507f867018ea03d52d98fd58acd15e882bc8d5c540f28dd987524899dfa0295e"} Mar 18 17:17:36.038478 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:36.038389 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" Mar 18 17:17:36.039402 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:36.039376 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Mar 18 17:17:36.053374 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:36.053332 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" podStartSLOduration=1.053318703 podStartE2EDuration="1.053318703s" podCreationTimestamp="2026-03-18 17:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:17:36.05165729 +0000 UTC m=+2010.063766114" watchObservedRunningTime="2026-03-18 17:17:36.053318703 +0000 UTC m=+2010.065427527" Mar 18 17:17:36.743727 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:36.743676 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: 
connect: connection refused" Mar 18 17:17:37.014163 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:37.014066 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" podUID="6557ac99-6842-4706-83ab-858761eb5031" containerName="splitter-graph-0c4f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:17:37.042154 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:37.042108 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Mar 18 17:17:38.508194 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:38.508167 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" Mar 18 17:17:38.659246 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:38.659161 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9pdj\" (UniqueName: \"kubernetes.io/projected/317736d7-450d-41e3-b7bf-0cd269b30334-kube-api-access-h9pdj\") pod \"317736d7-450d-41e3-b7bf-0cd269b30334\" (UID: \"317736d7-450d-41e3-b7bf-0cd269b30334\") " Mar 18 17:17:38.661216 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:38.661177 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317736d7-450d-41e3-b7bf-0cd269b30334-kube-api-access-h9pdj" (OuterVolumeSpecName: "kube-api-access-h9pdj") pod "317736d7-450d-41e3-b7bf-0cd269b30334" (UID: "317736d7-450d-41e3-b7bf-0cd269b30334"). InnerVolumeSpecName "kube-api-access-h9pdj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:17:38.760575 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:38.760498 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h9pdj\" (UniqueName: \"kubernetes.io/projected/317736d7-450d-41e3-b7bf-0cd269b30334-kube-api-access-h9pdj\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:17:39.050263 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:39.050227 2578 generic.go:358] "Generic (PLEG): container finished" podID="317736d7-450d-41e3-b7bf-0cd269b30334" containerID="9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66" exitCode=0 Mar 18 17:17:39.050507 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:39.050302 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" Mar 18 17:17:39.050507 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:39.050305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" event={"ID":"317736d7-450d-41e3-b7bf-0cd269b30334","Type":"ContainerDied","Data":"9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66"} Mar 18 17:17:39.050507 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:39.050342 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr" event={"ID":"317736d7-450d-41e3-b7bf-0cd269b30334","Type":"ContainerDied","Data":"028882afd801b0096f506a19565e67d1fd046eff4e37a3d7ee5cdfdf83a9e49c"} Mar 18 17:17:39.050507 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:39.050358 2578 scope.go:117] "RemoveContainer" containerID="9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66" Mar 18 17:17:39.058798 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:39.058780 2578 scope.go:117] "RemoveContainer" containerID="9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66" Mar 18 17:17:39.059057 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:17:39.059030 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66\": container with ID starting with 9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66 not found: ID does not exist" containerID="9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66" Mar 18 17:17:39.059148 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:39.059062 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66"} err="failed to get container status \"9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66\": rpc error: code = NotFound desc = could not find container \"9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66\": container with ID starting with 9324e9a06643460c8ac8df62d84bad56db7a764a10e274fa4b49ffb00784db66 not found: ID does not exist" Mar 18 17:17:39.069371 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:39.069349 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr"] Mar 18 17:17:39.072913 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:39.072893 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0c4f8-predictor-67fb899dc9-88dnr"] Mar 18 17:17:39.589757 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:17:39.589726 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" 
podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:17:40.592472 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:40.592430 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" path="/var/lib/kubelet/pods/317736d7-450d-41e3-b7bf-0cd269b30334/volumes" Mar 18 17:17:42.013712 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:42.013671 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" podUID="6557ac99-6842-4706-83ab-858761eb5031" containerName="splitter-graph-0c4f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:17:45.873381 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:45.873351 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" Mar 18 17:17:47.013849 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:47.013806 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" podUID="6557ac99-6842-4706-83ab-858761eb5031" containerName="splitter-graph-0c4f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:17:47.014216 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:47.013918 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:17:47.042788 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:47.042740 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Mar 18 17:17:50.589568 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:17:50.589521 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:17:52.014387 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:52.014325 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" podUID="6557ac99-6842-4706-83ab-858761eb5031" containerName="splitter-graph-0c4f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:17:57.013655 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:57.013610 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" podUID="6557ac99-6842-4706-83ab-858761eb5031" containerName="splitter-graph-0c4f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:17:57.042397 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:17:57.042346 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Mar 18 17:18:02.013948 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:02.013908 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" podUID="6557ac99-6842-4706-83ab-858761eb5031" containerName="splitter-graph-0c4f8" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 17:18:03.590126 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:18:03.590082 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:18:04.130756 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.130723 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp"] Mar 18 17:18:04.131177 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.131160 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" Mar 18 17:18:04.131250 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.131179 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" Mar 18 17:18:04.131250 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.131246 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="317736d7-450d-41e3-b7bf-0cd269b30334" containerName="kserve-container" Mar 18 17:18:04.135690 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.135665 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" Mar 18 17:18:04.138016 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.137980 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-eb922-kube-rbac-proxy-sar-config\"" Mar 18 17:18:04.138227 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.138210 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-eb922-serving-cert\"" Mar 18 17:18:04.139474 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.139449 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp"] Mar 18 17:18:04.175219 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.175169 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1631c823-d67a-417b-a542-fbfbbfb396ba-openshift-service-ca-bundle\") pod \"switch-graph-eb922-574c9ddc44-2c6pp\" (UID: \"1631c823-d67a-417b-a542-fbfbbfb396ba\") " pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" Mar 18 17:18:04.175397 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.175228 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1631c823-d67a-417b-a542-fbfbbfb396ba-proxy-tls\") pod \"switch-graph-eb922-574c9ddc44-2c6pp\" (UID: \"1631c823-d67a-417b-a542-fbfbbfb396ba\") " pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" Mar 18 17:18:04.276034 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.275992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1631c823-d67a-417b-a542-fbfbbfb396ba-openshift-service-ca-bundle\") pod \"switch-graph-eb922-574c9ddc44-2c6pp\" (UID: \"1631c823-d67a-417b-a542-fbfbbfb396ba\") " pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" Mar 18 17:18:04.276237 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.276053 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1631c823-d67a-417b-a542-fbfbbfb396ba-proxy-tls\") pod \"switch-graph-eb922-574c9ddc44-2c6pp\" (UID: \"1631c823-d67a-417b-a542-fbfbbfb396ba\") " pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" Mar 18 17:18:04.276237 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:18:04.276218 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-eb922-serving-cert: secret "switch-graph-eb922-serving-cert" not found Mar 18 17:18:04.276374 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:18:04.276287 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1631c823-d67a-417b-a542-fbfbbfb396ba-proxy-tls podName:1631c823-d67a-417b-a542-fbfbbfb396ba nodeName:}" failed. No retries permitted until 2026-03-18 17:18:04.776269049 +0000 UTC m=+2038.788377854 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1631c823-d67a-417b-a542-fbfbbfb396ba-proxy-tls") pod "switch-graph-eb922-574c9ddc44-2c6pp" (UID: "1631c823-d67a-417b-a542-fbfbbfb396ba") : secret "switch-graph-eb922-serving-cert" not found Mar 18 17:18:04.276748 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.276723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1631c823-d67a-417b-a542-fbfbbfb396ba-openshift-service-ca-bundle\") pod \"switch-graph-eb922-574c9ddc44-2c6pp\" (UID: \"1631c823-d67a-417b-a542-fbfbbfb396ba\") " pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" Mar 18 17:18:04.743098 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.743068 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:18:04.778597 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.778564 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6557ac99-6842-4706-83ab-858761eb5031-openshift-service-ca-bundle\") pod \"6557ac99-6842-4706-83ab-858761eb5031\" (UID: \"6557ac99-6842-4706-83ab-858761eb5031\") " Mar 18 17:18:04.778799 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.778608 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6557ac99-6842-4706-83ab-858761eb5031-proxy-tls\") pod \"6557ac99-6842-4706-83ab-858761eb5031\" (UID: \"6557ac99-6842-4706-83ab-858761eb5031\") " Mar 18 17:18:04.778799 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.778719 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1631c823-d67a-417b-a542-fbfbbfb396ba-proxy-tls\") pod \"switch-graph-eb922-574c9ddc44-2c6pp\" (UID: \"1631c823-d67a-417b-a542-fbfbbfb396ba\") " pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" Mar 18 17:18:04.778989 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.778959 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6557ac99-6842-4706-83ab-858761eb5031-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "6557ac99-6842-4706-83ab-858761eb5031" (UID: "6557ac99-6842-4706-83ab-858761eb5031"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 17:18:04.780827 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.780799 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6557ac99-6842-4706-83ab-858761eb5031-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6557ac99-6842-4706-83ab-858761eb5031" (UID: "6557ac99-6842-4706-83ab-858761eb5031"). InnerVolumeSpecName "proxy-tls". 
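The pod referencing `switch-graph-eb922-serving-cert` was scheduled before that secret existed, so the kubelet fails the mount and retries on a back-off (the 500ms `durationBeforeRetry` above) until the secret appears. An external test can wait for the same condition; a sketch, assuming a kubeconfig-derived clientset:

```go
// Poll until the serving-cert secret exists, mirroring what the kubelet's
// mount retry loop above is waiting for.
package main

import (
	"context"
	"fmt"
	"time"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	err = wait.PollUntilContextTimeout(context.Background(), 500*time.Millisecond, 2*time.Minute, true,
		func(ctx context.Context) (bool, error) {
			_, err := cs.CoreV1().Secrets("kserve-ci-e2e-test").
				Get(ctx, "switch-graph-eb922-serving-cert", metav1.GetOptions{})
			if apierrors.IsNotFound(err) {
				return false, nil // not created yet; keep polling
			}
			return err == nil, err
		})
	fmt.Println("secret ready:", err == nil)
}
```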
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 17:18:04.781243 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.781221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1631c823-d67a-417b-a542-fbfbbfb396ba-proxy-tls\") pod \"switch-graph-eb922-574c9ddc44-2c6pp\" (UID: \"1631c823-d67a-417b-a542-fbfbbfb396ba\") " pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" Mar 18 17:18:04.879460 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.879366 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6557ac99-6842-4706-83ab-858761eb5031-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:18:04.879460 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:04.879404 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6557ac99-6842-4706-83ab-858761eb5031-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:18:05.049433 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.049400 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" Mar 18 17:18:05.137730 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.137406 2578 generic.go:358] "Generic (PLEG): container finished" podID="6557ac99-6842-4706-83ab-858761eb5031" containerID="f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014" exitCode=0 Mar 18 17:18:05.137730 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.137467 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" event={"ID":"6557ac99-6842-4706-83ab-858761eb5031","Type":"ContainerDied","Data":"f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014"} Mar 18 17:18:05.137730 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.137493 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" event={"ID":"6557ac99-6842-4706-83ab-858761eb5031","Type":"ContainerDied","Data":"7a6a5ab1030b15bdb93873d7cf1dcb781f8f0f86235cb20cecbde08ca17d7f1d"} Mar 18 17:18:05.137730 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.137513 2578 scope.go:117] "RemoveContainer" containerID="f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014" Mar 18 17:18:05.137730 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.137677 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v" Mar 18 17:18:05.149899 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.149872 2578 scope.go:117] "RemoveContainer" containerID="f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014" Mar 18 17:18:05.150295 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:18:05.150245 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014\": container with ID starting with f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014 not found: ID does not exist" containerID="f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014" Mar 18 17:18:05.150407 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.150287 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014"} err="failed to get container status \"f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014\": rpc error: code = NotFound desc = could not find container \"f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014\": container with ID starting with f5273b16ec96a150273f76f304d9735ab3ba33c7fdc4505a3a3bd58caf601014 not found: ID does not exist" Mar 18 17:18:05.163634 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.163598 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v"] Mar 18 17:18:05.166343 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.166316 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-0c4f8-7c56559465-2s29v"] Mar 18 17:18:05.179543 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:05.179497 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp"] Mar 18 17:18:05.182392 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:18:05.182354 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1631c823_d67a_417b_a542_fbfbbfb396ba.slice/crio-c41ce15ae161efd29805134ad40c3c5554655983bcb79fa3c396f7f68990fc7f WatchSource:0}: Error finding container c41ce15ae161efd29805134ad40c3c5554655983bcb79fa3c396f7f68990fc7f: Status 404 returned error can't find the container with id c41ce15ae161efd29805134ad40c3c5554655983bcb79fa3c396f7f68990fc7f Mar 18 17:18:06.143733 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:06.143698 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" event={"ID":"1631c823-d67a-417b-a542-fbfbbfb396ba","Type":"ContainerStarted","Data":"870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849"} Mar 18 17:18:06.143733 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:06.143735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" event={"ID":"1631c823-d67a-417b-a542-fbfbbfb396ba","Type":"ContainerStarted","Data":"c41ce15ae161efd29805134ad40c3c5554655983bcb79fa3c396f7f68990fc7f"} Mar 18 17:18:06.144153 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:06.143768 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" Mar 18 17:18:06.159073 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:06.159023 2578 
Mar 18 17:18:06.593793 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:06.593754 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6557ac99-6842-4706-83ab-858761eb5031" path="/var/lib/kubelet/pods/6557ac99-6842-4706-83ab-858761eb5031/volumes"
Mar 18 17:18:07.043048 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:07.043005 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Mar 18 17:18:12.153556 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:12.153510 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp"
Mar 18 17:18:15.589930 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:18:15.589897 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:18:17.042883 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:17.042832 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Mar 18 17:18:27.042820 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:27.042727 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Mar 18 17:18:29.589670 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:18:29.589636 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:18:37.043739 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:37.043702 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56"
Mar 18 17:18:44.590016 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:18:44.589984 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:18:55.106864 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.106830 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc"]
Mar 18 17:18:55.107241 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.107189 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6557ac99-6842-4706-83ab-858761eb5031" containerName="splitter-graph-0c4f8"
Mar 18 17:18:55.107241 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.107201 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6557ac99-6842-4706-83ab-858761eb5031" containerName="splitter-graph-0c4f8"
Mar 18 17:18:55.107320 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.107276 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6557ac99-6842-4706-83ab-858761eb5031" containerName="splitter-graph-0c4f8"
Mar 18 17:18:55.110370 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.110352 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc"
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:18:55.112433 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.112411 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-6eae8-serving-cert\"" Mar 18 17:18:55.112585 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.112411 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-6eae8-kube-rbac-proxy-sar-config\"" Mar 18 17:18:55.117376 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.117329 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc"] Mar 18 17:18:55.197792 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.197750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08be6d3f-7873-46f4-a259-a0e4467eca34-openshift-service-ca-bundle\") pod \"splitter-graph-6eae8-7cdb674958-h2pjc\" (UID: \"08be6d3f-7873-46f4-a259-a0e4467eca34\") " pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:18:55.197976 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.197851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08be6d3f-7873-46f4-a259-a0e4467eca34-proxy-tls\") pod \"splitter-graph-6eae8-7cdb674958-h2pjc\" (UID: \"08be6d3f-7873-46f4-a259-a0e4467eca34\") " pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:18:55.298290 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.298258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08be6d3f-7873-46f4-a259-a0e4467eca34-proxy-tls\") pod \"splitter-graph-6eae8-7cdb674958-h2pjc\" (UID: \"08be6d3f-7873-46f4-a259-a0e4467eca34\") " pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:18:55.298483 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.298353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08be6d3f-7873-46f4-a259-a0e4467eca34-openshift-service-ca-bundle\") pod \"splitter-graph-6eae8-7cdb674958-h2pjc\" (UID: \"08be6d3f-7873-46f4-a259-a0e4467eca34\") " pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:18:55.298483 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:18:55.298407 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-6eae8-serving-cert: secret "splitter-graph-6eae8-serving-cert" not found Mar 18 17:18:55.298602 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:18:55.298486 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08be6d3f-7873-46f4-a259-a0e4467eca34-proxy-tls podName:08be6d3f-7873-46f4-a259-a0e4467eca34 nodeName:}" failed. No retries permitted until 2026-03-18 17:18:55.798465699 +0000 UTC m=+2089.810574518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/08be6d3f-7873-46f4-a259-a0e4467eca34-proxy-tls") pod "splitter-graph-6eae8-7cdb674958-h2pjc" (UID: "08be6d3f-7873-46f4-a259-a0e4467eca34") : secret "splitter-graph-6eae8-serving-cert" not found Mar 18 17:18:55.299056 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.299037 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08be6d3f-7873-46f4-a259-a0e4467eca34-openshift-service-ca-bundle\") pod \"splitter-graph-6eae8-7cdb674958-h2pjc\" (UID: \"08be6d3f-7873-46f4-a259-a0e4467eca34\") " pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:18:55.803241 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.803202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08be6d3f-7873-46f4-a259-a0e4467eca34-proxy-tls\") pod \"splitter-graph-6eae8-7cdb674958-h2pjc\" (UID: \"08be6d3f-7873-46f4-a259-a0e4467eca34\") " pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:18:55.805688 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:55.805669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08be6d3f-7873-46f4-a259-a0e4467eca34-proxy-tls\") pod \"splitter-graph-6eae8-7cdb674958-h2pjc\" (UID: \"08be6d3f-7873-46f4-a259-a0e4467eca34\") " pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:18:56.022407 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:56.022362 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:18:56.149425 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:56.149393 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc"] Mar 18 17:18:56.152887 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:18:56.152855 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08be6d3f_7873_46f4_a259_a0e4467eca34.slice/crio-6b8ae25705fc1f1841672dcfb48879e973b54e27bf6a4ea5b7a142121577ec3f WatchSource:0}: Error finding container 6b8ae25705fc1f1841672dcfb48879e973b54e27bf6a4ea5b7a142121577ec3f: Status 404 returned error can't find the container with id 6b8ae25705fc1f1841672dcfb48879e973b54e27bf6a4ea5b7a142121577ec3f Mar 18 17:18:56.303477 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:56.303442 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" event={"ID":"08be6d3f-7873-46f4-a259-a0e4467eca34","Type":"ContainerStarted","Data":"c3a276fc9d91c7e92682e34d8fa4faec0a1ce98b4d4b599370b25c5d7cbea425"} Mar 18 17:18:56.303477 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:56.303484 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" event={"ID":"08be6d3f-7873-46f4-a259-a0e4467eca34","Type":"ContainerStarted","Data":"6b8ae25705fc1f1841672dcfb48879e973b54e27bf6a4ea5b7a142121577ec3f"} Mar 18 17:18:56.303768 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:18:56.303549 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:18:56.334864 ip-10-0-143-175 kubenswrapper[2578]: I0318 
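Both serving-cert races above resolve within a second: the secret is "not found" at 17:18:55.298 and mounts cleanly at 17:18:55.805. On OpenShift, secrets named like `splitter-graph-6eae8-serving-cert` are typically minted by the service-ca operator from a Service annotation, which is why they appear moments after the pod. Whether these KServe graphs use exactly this mechanism is an assumption; a sketch of the standard pattern:

```go
// Annotating a Service this way makes the service-ca operator write a TLS
// key pair into the named secret shortly after the Service exists, producing
// the create-then-mount race visible above.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	svc := corev1.Service{
		ObjectMeta: metav1.ObjectMeta{
			Name:      "splitter-graph-6eae8",
			Namespace: "kserve-ci-e2e-test",
			Annotations: map[string]string{
				// service-ca watches this annotation and creates the secret.
				"service.beta.openshift.io/serving-cert-secret-name": "splitter-graph-6eae8-serving-cert",
			},
		},
		Spec: corev1.ServiceSpec{
			Ports: []corev1.ServicePort{{Name: "https", Port: 8443, TargetPort: intstr.FromInt32(8443)}},
		},
	}
	fmt.Println(svc.Annotations)
}
```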
Mar 18 17:18:58.589766 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:18:58.589732 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:19:02.313583 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:19:02.313514 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc"
Mar 18 17:19:12.879479 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:19:12.879376 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast"
Mar 18 17:19:12.879899 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:19:12.879644 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 17:19:12.880843 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:19:12.880814 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:19:24.589278 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:19:24.589245 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:19:38.590158 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:19:38.589917 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
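The `&Container{...}` blob in the "Unhandled Error" entry above is kuberuntime's Go-struct dump of the failing `server` container. Re-typed as a corev1.Container literal it is much easier to read; all field values below are taken from the dump itself (fields at their zero/default values are omitted):

```go
package main

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func boolPtr(b bool) *bool    { return &b }
func int64Ptr(i int64) *int64 { return &i }

var serverContainer = corev1.Container{
	Name:  "server",
	Image: "quay.io/opendatahub/odh-model-serving-api:fast",
	Ports: []corev1.ContainerPort{
		{Name: "https", ContainerPort: 8443, Protocol: corev1.ProtocolTCP},
		{Name: "metrics", ContainerPort: 9090, Protocol: corev1.ProtocolTCP},
	},
	Env: []corev1.EnvVar{
		{Name: "TLS_CERT_DIR", Value: "/tls"},
		{Name: "GATEWAY_LABEL_SELECTOR", Value: ""},
	},
	Resources: corev1.ResourceRequirements{
		Limits: corev1.ResourceList{
			corev1.ResourceCPU:    resource.MustParse("200m"),
			corev1.ResourceMemory: resource.MustParse("128Mi"), // 134217728 bytes
		},
		Requests: corev1.ResourceList{
			corev1.ResourceCPU:    resource.MustParse("50m"),
			corev1.ResourceMemory: resource.MustParse("64Mi"), // 67108864 bytes
		},
	},
	VolumeMounts: []corev1.VolumeMount{
		{Name: "tls-certs", ReadOnly: true, MountPath: "/tls"},
		{Name: "kube-api-access-xp5hl", ReadOnly: true, MountPath: "/var/run/secrets/kubernetes.io/serviceaccount"},
	},
	LivenessProbe: &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{Path: "/healthz", Port: intstr.FromInt32(8443), Scheme: corev1.URISchemeHTTPS},
		},
		InitialDelaySeconds: 5, TimeoutSeconds: 1, PeriodSeconds: 10,
		SuccessThreshold: 1, FailureThreshold: 3,
	},
	ReadinessProbe: &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{Path: "/readyz", Port: intstr.FromInt32(8443), Scheme: corev1.URISchemeHTTPS},
		},
		InitialDelaySeconds: 5, TimeoutSeconds: 1, PeriodSeconds: 10,
		SuccessThreshold: 1, FailureThreshold: 3,
	},
	ImagePullPolicy: corev1.PullIfNotPresent,
	SecurityContext: &corev1.SecurityContext{
		Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
		RunAsUser:                int64Ptr(1000640000),
		ReadOnlyRootFilesystem:   boolPtr(true),
		AllowPrivilegeEscalation: boolPtr(false),
	},
}

func main() { _ = serverContainer }
```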
\\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:19:53.591917 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:19:53.589837 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:20:05.589629 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:20:05.589542 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:20:20.589616 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:20:20.589571 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:20:34.589725 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:20:34.589699 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:20:34.590049 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:20:34.589982 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:20:49.589577 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:20:49.589541 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:21:01.589951 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:21:01.589874 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:21:13.589612 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:21:13.589573 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:21:28.589948 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:21:28.589916 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing 
Mar 18 17:21:40.589890 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:21:40.589848 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:21:55.589699 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:21:55.589668 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:22:09.589757 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:22:09.589678 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:22:23.588979 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:22:23.588946 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:22:38.589609 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:22:38.589569 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:22:52.590112 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:22:52.590049 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:23:07.589117 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:23:07.589082 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:23:20.589246 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:23:20.589197 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:23:32.589665 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:23:32.589388 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:23:44.589694 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:23:44.589659 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:23:57.589687 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:23:57.589651 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:24:10.593601 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:24:10.593476 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:24:23.876739 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:24:23.876630 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast"
Mar 18 17:24:23.877208 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:24:23.876822 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
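Note the cadence: real pull attempts (the "PullImage from image service failed" plus full container dump) occur at 17:19:12 and again here at 17:24:23, roughly five minutes apart, with only "Back-off pulling image" notices in between. That spacing is consistent with the kubelet's image pull back-off, which by default starts at 10s and doubles up to a 5-minute cap (default values; illustrative only):

```go
// Sketch of the back-off schedule that would produce ~5-minute gaps
// between genuine pull attempts once the cap is reached.
package main

import (
	"fmt"
	"time"
)

func main() {
	const maxBackoff = 5 * time.Minute
	backoff := 10 * time.Second
	for i := 0; i < 8; i++ {
		fmt.Println(backoff) // 10s 20s 40s 1m20s 2m40s 5m0s 5m0s 5m0s
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}
```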
resource is not authorized" logger="UnhandledError" Mar 18 17:24:23.877997 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:24:23.877968 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:24:36.591206 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:24:36.591129 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:24:51.590006 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:24:51.589975 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:25:02.592184 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:25:02.590227 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:25:16.591830 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:25:16.591661 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:25:28.590073 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:25:28.589958 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:25:43.589792 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:25:43.589659 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:25:43.590127 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:25:43.589855 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:25:58.590187 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:25:58.590078 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:26:13.589724 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:26:13.589692 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:26:24.589380 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:26:24.589346 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:26:39.590083 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:26:39.590050 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:26:51.589696 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:26:51.589665 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:27:04.591621 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:27:04.591578 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: 
Mar 18 17:27:09.858062 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:09.858029 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc"]
Mar 18 17:27:09.858540 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:09.858252 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" podUID="08be6d3f-7873-46f4-a259-a0e4467eca34" containerName="splitter-graph-6eae8" containerID="cri-o://c3a276fc9d91c7e92682e34d8fa4faec0a1ce98b4d4b599370b25c5d7cbea425" gracePeriod=30
Mar 18 17:27:10.031634 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:10.031598 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56"]
Mar 18 17:27:10.031876 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:10.031835 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container" containerID="cri-o://4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462" gracePeriod=30
Mar 18 17:27:12.311086 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:12.311040 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" podUID="08be6d3f-7873-46f4-a259-a0e4467eca34" containerName="splitter-graph-6eae8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 17:27:13.475561 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.475520 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56"
Mar 18 17:27:13.529345 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.529312 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72pl2\" (UniqueName: \"kubernetes.io/projected/98b3763c-585d-418a-a124-a44e45ae1196-kube-api-access-72pl2\") pod \"98b3763c-585d-418a-a124-a44e45ae1196\" (UID: \"98b3763c-585d-418a-a124-a44e45ae1196\") "
Mar 18 17:27:13.531382 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.531351 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98b3763c-585d-418a-a124-a44e45ae1196-kube-api-access-72pl2" (OuterVolumeSpecName: "kube-api-access-72pl2") pod "98b3763c-585d-418a-a124-a44e45ae1196" (UID: "98b3763c-585d-418a-a124-a44e45ae1196"). InnerVolumeSpecName "kube-api-access-72pl2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:27:13.630234 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.630138 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72pl2\" (UniqueName: \"kubernetes.io/projected/98b3763c-585d-418a-a124-a44e45ae1196-kube-api-access-72pl2\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 17:27:13.895744 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.895656 2578 generic.go:358] "Generic (PLEG): container finished" podID="98b3763c-585d-418a-a124-a44e45ae1196" containerID="4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462" exitCode=0
Mar 18 17:27:13.895744 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.895721 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56"
Mar 18 17:27:13.895744 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.895735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" event={"ID":"98b3763c-585d-418a-a124-a44e45ae1196","Type":"ContainerDied","Data":"4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462"}
Mar 18 17:27:13.896015 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.895777 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56" event={"ID":"98b3763c-585d-418a-a124-a44e45ae1196","Type":"ContainerDied","Data":"507f867018ea03d52d98fd58acd15e882bc8d5c540f28dd987524899dfa0295e"}
Mar 18 17:27:13.896015 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.895796 2578 scope.go:117] "RemoveContainer" containerID="4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462"
Mar 18 17:27:13.904166 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.904148 2578 scope.go:117] "RemoveContainer" containerID="4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462"
Mar 18 17:27:13.904425 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:27:13.904407 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462\": container with ID starting with 4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462 not found: ID does not exist" containerID="4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462"
Mar 18 17:27:13.904496 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.904438 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462"} err="failed to get container status \"4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462\": rpc error: code = NotFound desc = could not find container \"4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462\": container with ID starting with 4a1c8c68658ce639cb7beb8871b20a987a2813e83254e7baf062a3428e3e9462 not found: ID does not exist"
Mar 18 17:27:13.916479 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.916454 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56"]
Mar 18 17:27:13.921267 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:13.921246 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6eae8-predictor-7b94df567-r9c56"]
Mar 18 17:27:14.592716 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:14.592680 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98b3763c-585d-418a-a124-a44e45ae1196" path="/var/lib/kubelet/pods/98b3763c-585d-418a-a124-a44e45ae1196/volumes"
[identical ImagePullBackOff entries repeated at 17:27:15 and 17:27:27]
Mar 18 17:27:17.312147 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:17.312105 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" podUID="08be6d3f-7873-46f4-a259-a0e4467eca34" containerName="splitter-graph-6eae8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
[identical readiness-probe failures for splitter-graph-6eae8 repeated at 17:27:22, 17:27:27, 17:27:32, 17:27:37]
Mar 18 17:27:22.311845 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:22.311484 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc"
Mar 18 17:27:39.983941 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:39.983897 2578 generic.go:358] "Generic (PLEG): container finished" podID="08be6d3f-7873-46f4-a259-a0e4467eca34" containerID="c3a276fc9d91c7e92682e34d8fa4faec0a1ce98b4d4b599370b25c5d7cbea425" exitCode=0
Mar 18 17:27:39.984302 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:39.983935 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" event={"ID":"08be6d3f-7873-46f4-a259-a0e4467eca34","Type":"ContainerDied","Data":"c3a276fc9d91c7e92682e34d8fa4faec0a1ce98b4d4b599370b25c5d7cbea425"}
Mar 18 17:27:40.499635 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:40.499613 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc"
Mar 18 17:27:40.660247 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:40.660161 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08be6d3f-7873-46f4-a259-a0e4467eca34-openshift-service-ca-bundle\") pod \"08be6d3f-7873-46f4-a259-a0e4467eca34\" (UID: \"08be6d3f-7873-46f4-a259-a0e4467eca34\") "
Mar 18 17:27:40.660247 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:40.660238 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08be6d3f-7873-46f4-a259-a0e4467eca34-proxy-tls\") pod \"08be6d3f-7873-46f4-a259-a0e4467eca34\" (UID: \"08be6d3f-7873-46f4-a259-a0e4467eca34\") "
Mar 18 17:27:40.660502 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:40.660477 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08be6d3f-7873-46f4-a259-a0e4467eca34-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "08be6d3f-7873-46f4-a259-a0e4467eca34" (UID: "08be6d3f-7873-46f4-a259-a0e4467eca34"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 17:27:40.662682 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:40.662649 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08be6d3f-7873-46f4-a259-a0e4467eca34-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "08be6d3f-7873-46f4-a259-a0e4467eca34" (UID: "08be6d3f-7873-46f4-a259-a0e4467eca34"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
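splitter-graph keeps failing its readiness probe with 503 for the full 30 s grace period after the kubelet starts killing it, then exits with exitCode=0. That is the normal drain pattern: a terminating server flips its readiness endpoint to 503 so it is removed from service endpoints while in-flight requests finish. A minimal sketch of that pattern in plain Go (port and timings are illustrative, not taken from this workload):

```go
package main

import (
	"context"
	"net/http"
	"os"
	"os/signal"
	"sync/atomic"
	"syscall"
	"time"
)

func main() {
	var terminating atomic.Bool

	mux := http.NewServeMux()
	// Readiness: report 503 as soon as shutdown starts, so the pod is pulled
	// out of endpoints while existing requests drain (the probe failures in
	// the log above are this, observed from the kubelet's side).
	mux.HandleFunc("/readyz", func(w http.ResponseWriter, r *http.Request) {
		if terminating.Load() {
			http.Error(w, "shutting down", http.StatusServiceUnavailable)
			return
		}
		w.WriteHeader(http.StatusOK)
	})

	srv := &http.Server{Addr: ":8080", Handler: mux}
	go srv.ListenAndServe()

	// SIGTERM is what the runtime delivers at the start of the grace period.
	stop := make(chan os.Signal, 1)
	signal.Notify(stop, syscall.SIGTERM, os.Interrupt)
	<-stop

	terminating.Store(true)
	// Finish draining inside the pod's terminationGracePeriodSeconds
	// (30 s for the pods in this log), leaving some headroom.
	ctx, cancel := context.WithTimeout(context.Background(), 25*time.Second)
	defer cancel()
	srv.Shutdown(ctx)
}
```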
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 17:27:40.761225 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:40.761185 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08be6d3f-7873-46f4-a259-a0e4467eca34-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:27:40.761225 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:40.761219 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08be6d3f-7873-46f4-a259-a0e4467eca34-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\"" Mar 18 17:27:40.988826 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:40.988733 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" event={"ID":"08be6d3f-7873-46f4-a259-a0e4467eca34","Type":"ContainerDied","Data":"6b8ae25705fc1f1841672dcfb48879e973b54e27bf6a4ea5b7a142121577ec3f"} Mar 18 17:27:40.988826 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:40.988788 2578 scope.go:117] "RemoveContainer" containerID="c3a276fc9d91c7e92682e34d8fa4faec0a1ce98b4d4b599370b25c5d7cbea425" Mar 18 17:27:40.989286 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:40.988751 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc" Mar 18 17:27:41.011081 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:41.011053 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc"] Mar 18 17:27:41.014166 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:41.014145 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6eae8-7cdb674958-h2pjc"] Mar 18 17:27:42.598063 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:27:42.598028 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08be6d3f-7873-46f4-a259-a0e4467eca34" path="/var/lib/kubelet/pods/08be6d3f-7873-46f4-a259-a0e4467eca34/volumes" Mar 18 17:27:42.599453 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:27:42.599426 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:27:56.591143 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:27:56.591109 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; 
artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:28:08.589359 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:28:08.589325 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:28:19.589699 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:28:19.589661 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:28:30.589885 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:28:30.589836 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:28:44.590120 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:28:44.590083 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is 
not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:28:55.589709 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:28:55.589662 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:29:07.589653 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:29:07.589610 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:29:20.590521 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:29:20.590301 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:29:35.898130 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:29:35.898026 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:29:35.898555 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:29:35.898218 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 17:29:35.899408 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:29:35.899378 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db" Mar 18 17:29:51.589670 
[identical ImagePullBackOff entries repeated at 17:29:51, 17:30:03, 17:30:14, 17:30:29]
Mar 18 17:30:44.589997 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:30:44.589803 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
[identical ImagePullBackOff entries repeated at 17:30:44, 17:30:57, 17:31:09, 17:31:24, 17:31:39, 17:31:50, 17:32:03, 17:32:17, 17:32:30]
[identical ImagePullBackOff entries repeated at 17:32:44, 17:32:57, 17:33:11, 17:33:22, 17:33:37, 17:33:49, 17:34:03, 17:34:14]
Mar 18 17:34:23.462551 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:23.462480 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp"]
Mar 18 17:34:23.463023 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:23.462798 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerName="switch-graph-eb922" containerID="cri-o://870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849" gracePeriod=30
Mar 18 17:34:23.716904 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:23.716825 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89"]
Mar 18 17:34:23.717109 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:23.717078 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container" containerID="cri-o://6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184" gracePeriod=30
[identical ImagePullBackOff entry repeated at 17:34:25]
Mar 18 17:34:25.871951 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:25.871852 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Mar 18 17:34:26.758116 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:26.758092 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89"
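"Killing container with a grace period" ... gracePeriod=30 is the SIGTERM-then-SIGKILL contract: the runtime delivers SIGTERM, waits at most terminationGracePeriodSeconds, then force-kills; both containers in this log exit cleanly (exitCode=0) inside the window. A minimal illustration of that sequencing against a plain process, not CRI or kubelet code:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace sends SIGTERM, then SIGKILLs the process if it is still
// alive when the grace period runs out (30 s for the pods above).
func killWithGrace(cmd *exec.Cmd, grace time.Duration) error {
	if err := cmd.Process.Signal(syscall.SIGTERM); err != nil {
		return err
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case err := <-done:
		return err // exited within the grace period, like the containers here
	case <-time.After(grace):
		return cmd.Process.Kill() // hard stop, the runtime's SIGKILL
	}
}

func main() {
	cmd := exec.Command("sleep", "3")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	fmt.Println(killWithGrace(cmd, 30*time.Second))
}
```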
Mar 18 17:34:26.868843 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:26.868754 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz5hh\" (UniqueName: \"kubernetes.io/projected/8ec60ecd-1545-462c-9db3-184ba0c557d0-kube-api-access-lz5hh\") pod \"8ec60ecd-1545-462c-9db3-184ba0c557d0\" (UID: \"8ec60ecd-1545-462c-9db3-184ba0c557d0\") "
Mar 18 17:34:26.870915 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:26.870877 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec60ecd-1545-462c-9db3-184ba0c557d0-kube-api-access-lz5hh" (OuterVolumeSpecName: "kube-api-access-lz5hh") pod "8ec60ecd-1545-462c-9db3-184ba0c557d0" (UID: "8ec60ecd-1545-462c-9db3-184ba0c557d0"). InnerVolumeSpecName "kube-api-access-lz5hh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:34:26.970128 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:26.970089 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lz5hh\" (UniqueName: \"kubernetes.io/projected/8ec60ecd-1545-462c-9db3-184ba0c557d0-kube-api-access-lz5hh\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 17:34:27.151204 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:27.151114 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerName="switch-graph-eb922" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 17:34:27.267249 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:27.267213 2578 generic.go:358] "Generic (PLEG): container finished" podID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerID="6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184" exitCode=0
Mar 18 17:34:27.267442 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:27.267282 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89"
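
The three kube-api-access-lz5hh entries trace the teardown path for the deleted pod's volume: UnmountVolume started, then TearDown succeeded, then Volume detached (with an empty DevicePath, since a projected volume has no block device). The reconciler behind these messages is a desired-state versus actual-state loop; a minimal sketch of that pattern with hypothetical types (not the kubelet's volume manager):

    // Minimal desired/actual reconcile loop in the style of the volume
    // teardown entries above: anything mounted but no longer desired is
    // torn down. Hypothetical types; not the kubelet implementation.
    package main

    import "fmt"

    type volume string

    func reconcile(desired, mounted map[volume]bool) {
    	for v := range mounted {
    		if !desired[v] {
    			fmt.Printf("UnmountVolume started for volume %q\n", v)
    			// ... unmount work would happen here ...
    			delete(mounted, v)
    			fmt.Printf("Volume detached for volume %q\n", v)
    		}
    	}
    }

    func main() {
    	desired := map[volume]bool{} // pod deleted: nothing desired
    	mounted := map[volume]bool{"kube-api-access-lz5hh": true}
    	reconcile(desired, mounted)
    }
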
Mar 18 17:34:27.267442 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:27.267279 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" event={"ID":"8ec60ecd-1545-462c-9db3-184ba0c557d0","Type":"ContainerDied","Data":"6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184"}
Mar 18 17:34:27.267442 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:27.267391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89" event={"ID":"8ec60ecd-1545-462c-9db3-184ba0c557d0","Type":"ContainerDied","Data":"e3f4af55cf100305a62dc4be92e91a3141922522af29f990f477e8314b7f2e73"}
Mar 18 17:34:27.267442 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:27.267407 2578 scope.go:117] "RemoveContainer" containerID="6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184"
Mar 18 17:34:27.276092 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:27.276070 2578 scope.go:117] "RemoveContainer" containerID="6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184"
Mar 18 17:34:27.276336 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:34:27.276318 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184\": container with ID starting with 6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184 not found: ID does not exist" containerID="6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184"
Mar 18 17:34:27.276399 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:27.276345 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184"} err="failed to get container status \"6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184\": rpc error: code = NotFound desc = could not find container \"6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184\": container with ID starting with 6336645f70edd1434022b2ce70771728ff5682abeb80afb7018ff1fe96218184 not found: ID does not exist"
Mar 18 17:34:27.287625 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:27.287600 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89"]
Mar 18 17:34:27.290755 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:27.290734 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-eb922-predictor-ff68cf985-w7v89"]
Mar 18 17:34:28.593857 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:28.593820 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" path="/var/lib/kubelet/pods/8ec60ecd-1545-462c-9db3-184ba0c557d0/volumes"
Mar 18 17:34:32.150923 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:32.150880 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerName="switch-graph-eb922" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 17:34:37.151249 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:37.151205 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerName="switch-graph-eb922" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 17:34:37.151775 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:37.151324 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp"
Mar 18 17:34:38.886964 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:34:38.886854 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast"
Mar 18 17:34:38.887403 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:34:38.887051 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xp5hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-stcsx_kserve(4f796fc2-c6ea-4936-8fae-6cd0d2b876db): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
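
The "Unhandled Error" entry above serializes the entire Container spec that failed to start. The useful facts buried in it: requests of 50m CPU and 64Mi memory against limits of 200m and 128Mi, HTTPS liveness and readiness probes against /healthz and /readyz on port 8443 (InitialDelaySeconds:5, PeriodSeconds:10, FailureThreshold:3), and a restrictive securityContext (all capabilities dropped, read-only root filesystem, no privilege escalation). Once the pull is fixed, the probe endpoints are simple to satisfy; a sketch of matching handlers, using plain HTTP for brevity where the real probes use scheme HTTPS with certificates from the /tls mount:

    // Handlers matching the /healthz and /readyz probe paths in the
    // container spec above. Sketch only; the real server terminates TLS.
    package main

    import "net/http"

    func main() {
    	ok := func(w http.ResponseWriter, r *http.Request) {
    		w.WriteHeader(http.StatusOK) // any 2xx/3xx counts as probe success
    		w.Write([]byte("ok"))
    	}
    	http.HandleFunc("/healthz", ok) // liveness: container restarted on failure
    	http.HandleFunc("/readyz", ok)  // readiness: removed from endpoints on failure
    	http.ListenAndServe(":8443", nil)
    }
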
Mar 18 17:34:38.888244 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:34:38.888215 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:34:39.716848 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:39.716815 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:40.460821 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:40.460784 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:41.207490 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:41.207454 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:41.933006 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:41.932978 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:42.151234 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:42.151196 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerName="switch-graph-eb922" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 17:34:42.683440 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:42.683402 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:43.416229 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:43.416179 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:44.136703 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:44.136674 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:44.858896 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:44.858867 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:45.592450 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:45.592410 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:46.318023 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:46.317988 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:47.037715 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:47.037663 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:47.151118 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:47.151081 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerName="switch-graph-eb922" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 17:34:47.805509 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:47.805478 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-eb922-574c9ddc44-2c6pp_1631c823-d67a-417b-a542-fbfbbfb396ba/switch-graph-eb922/0.log"
Mar 18 17:34:49.589912 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:34:49.589876 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:34:52.151179 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:52.151134 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerName="switch-graph-eb922" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 17:34:53.605308 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:53.605286 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp"
Mar 18 17:34:53.691303 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:53.691267 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1631c823-d67a-417b-a542-fbfbbfb396ba-proxy-tls\") pod \"1631c823-d67a-417b-a542-fbfbbfb396ba\" (UID: \"1631c823-d67a-417b-a542-fbfbbfb396ba\") "
Mar 18 17:34:53.691464 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:53.691359 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1631c823-d67a-417b-a542-fbfbbfb396ba-openshift-service-ca-bundle\") pod \"1631c823-d67a-417b-a542-fbfbbfb396ba\" (UID: \"1631c823-d67a-417b-a542-fbfbbfb396ba\") "
Mar 18 17:34:53.691723 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:53.691702 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1631c823-d67a-417b-a542-fbfbbfb396ba-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "1631c823-d67a-417b-a542-fbfbbfb396ba" (UID: "1631c823-d67a-417b-a542-fbfbbfb396ba"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 17:34:53.693365 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:53.693338 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1631c823-d67a-417b-a542-fbfbbfb396ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1631c823-d67a-417b-a542-fbfbbfb396ba" (UID: "1631c823-d67a-417b-a542-fbfbbfb396ba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 17:34:53.792676 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:53.792649 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1631c823-d67a-417b-a542-fbfbbfb396ba-proxy-tls\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 17:34:53.792848 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:53.792681 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1631c823-d67a-417b-a542-fbfbbfb396ba-openshift-service-ca-bundle\") on node \"ip-10-0-143-175.ec2.internal\" DevicePath \"\""
Mar 18 17:34:54.356075 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.356038 2578 generic.go:358] "Generic (PLEG): container finished" podID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerID="870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849" exitCode=0
Mar 18 17:34:54.356274 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.356106 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp"
Mar 18 17:34:54.356274 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.356111 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" event={"ID":"1631c823-d67a-417b-a542-fbfbbfb396ba","Type":"ContainerDied","Data":"870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849"}
Mar 18 17:34:54.356274 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.356145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp" event={"ID":"1631c823-d67a-417b-a542-fbfbbfb396ba","Type":"ContainerDied","Data":"c41ce15ae161efd29805134ad40c3c5554655983bcb79fa3c396f7f68990fc7f"}
Mar 18 17:34:54.356274 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.356161 2578 scope.go:117] "RemoveContainer" containerID="870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849"
Mar 18 17:34:54.364310 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.364289 2578 scope.go:117] "RemoveContainer" containerID="870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849"
Mar 18 17:34:54.364620 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:34:54.364598 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849\": container with ID starting with 870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849 not found: ID does not exist" containerID="870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849"
Mar 18 17:34:54.364692 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.364626 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849"} err="failed to get container status \"870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849\": rpc error: code = NotFound desc = could not find container \"870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849\": container with ID starting with 870a5ca9ea4480e8f28250bfb5eb5f9b90b3eb0c886b0d1f723718fb2787a849 not found: ID does not exist"
Mar 18 17:34:54.375893 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.375865 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp"]
Mar 18 17:34:54.379054 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.379033 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-eb922-574c9ddc44-2c6pp"]
Mar 18 17:34:54.592904 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.592875 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" path="/var/lib/kubelet/pods/1631c823-d67a-417b-a542-fbfbbfb396ba/volumes"
Mar 18 17:34:54.679406 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.679320 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-52khs_31d1ce77-15fd-48f5-844f-07de4a0cdfc4/global-pull-secret-syncer/0.log"
Mar 18 17:34:54.832387 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:54.832357 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9fxzt_0faceadc-eddf-479d-b008-553314c47823/konnectivity-agent/0.log"
Mar 18 17:34:55.020217 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:55.020136 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-175.ec2.internal_a2c314ed716f4e99aa7d05b49c88fc46/haproxy/0.log"
Mar 18 17:34:58.592203 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:58.592133 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-57f6d8dc57-4tc82_4d7aa466-f884-4bf5-b245-b7e3250058d9/metrics-server/0.log"
Mar 18 17:34:58.619540 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:58.619495 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-6d47bdb78d-7mt94_127a2232-0ea6-440f-ad16-58a6181cf5c7/monitoring-plugin/0.log"
Mar 18 17:34:58.847091 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:58.846998 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hnxlk_6d1cbe17-16d8-49c6-9433-bee7ad721a0d/node-exporter/0.log"
Mar 18 17:34:58.868677 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:58.868650 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hnxlk_6d1cbe17-16d8-49c6-9433-bee7ad721a0d/kube-rbac-proxy/0.log"
Mar 18 17:34:58.891881 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:58.891857 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hnxlk_6d1cbe17-16d8-49c6-9433-bee7ad721a0d/init-textfile/0.log"
Mar 18 17:34:59.293338 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:59.293184 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74554c6fc7-vjhfx_73448893-1fc0-402a-9e38-cf3a81a0a9f0/telemeter-client/0.log"
Mar 18 17:34:59.318764 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:59.318741 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74554c6fc7-vjhfx_73448893-1fc0-402a-9e38-cf3a81a0a9f0/reload/0.log"
Mar 18 17:34:59.347250 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:34:59.347226 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74554c6fc7-vjhfx_73448893-1fc0-402a-9e38-cf3a81a0a9f0/kube-rbac-proxy/0.log"
Mar 18 17:35:02.012283 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.012254 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-67fdcb5769-8bt8k_e11605b2-ec2b-4f4f-82c3-4db202827f31/volume-data-source-validator/0.log"
Mar 18 17:35:02.526174 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526139 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"]
Mar 18 17:35:02.526608 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526585 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08be6d3f-7873-46f4-a259-a0e4467eca34" containerName="splitter-graph-6eae8"
Mar 18 17:35:02.526608 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526608 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="08be6d3f-7873-46f4-a259-a0e4467eca34" containerName="splitter-graph-6eae8"
Mar 18 17:35:02.526760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526621 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container"
Mar 18 17:35:02.526760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526627 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container"
Mar 18 17:35:02.526760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526642 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerName="switch-graph-eb922"
Mar 18 17:35:02.526760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526647 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerName="switch-graph-eb922"
Mar 18 17:35:02.526760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526658 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container"
Mar 18 17:35:02.526760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526664 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container"
Mar 18 17:35:02.526760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526708 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ec60ecd-1545-462c-9db3-184ba0c557d0" containerName="kserve-container"
Mar 18 17:35:02.526760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526719 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="98b3763c-585d-418a-a124-a44e45ae1196" containerName="kserve-container"
Mar 18 17:35:02.526760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526726 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="08be6d3f-7873-46f4-a259-a0e4467eca34" containerName="splitter-graph-6eae8"
Mar 18 17:35:02.526760 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.526733 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1631c823-d67a-417b-a542-fbfbbfb396ba" containerName="switch-graph-eb922"
Mar 18 17:35:02.530779 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.530758 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
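
Before admitting perf-node-gather-daemonset-p6gt9, the CPU and memory managers sweep their checkpointed per-container state and drop entries for pods that no longer exist (the kserve and graph pods deleted above); without this sweep, stale CPUSet and memory assignments would leak across pod lifetimes. The pattern is a set-difference sweep, sketched below with a hypothetical state shape (not the kubelet's checkpoint format):

    // Stale-state sweep in the style of RemoveStaleState above: delete
    // per-container assignments whose pod is no longer active.
    package main

    import "fmt"

    func main() {
    	active := map[string]bool{"525f1a2d-3083-4ebd-a8c7-7633ba0c20df": true}
    	assignments := map[string][]string{ // podUID -> containers with pinned state
    		"8ec60ecd-1545-462c-9db3-184ba0c557d0": {"kserve-container"},
    		"1631c823-d67a-417b-a542-fbfbbfb396ba": {"switch-graph-eb922"},
    	}
    	for podUID, containers := range assignments {
    		if !active[podUID] {
    			for _, c := range containers {
    				fmt.Printf("removing stale state podUID=%s container=%s\n", podUID, c)
    			}
    			delete(assignments, podUID) // safe during range in Go
    		}
    	}
    }
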
Mar 18 17:35:02.532786 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.532765 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gffj5\"/\"openshift-service-ca.crt\""
Mar 18 17:35:02.532897 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.532811 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gffj5\"/\"default-dockercfg-4qhj7\""
Mar 18 17:35:02.532897 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.532855 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gffj5\"/\"kube-root-ca.crt\""
Mar 18 17:35:02.535782 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.535762 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"]
Mar 18 17:35:02.562424 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.562388 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-sys\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.562424 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.562423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxf72\" (UniqueName: \"kubernetes.io/projected/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-kube-api-access-kxf72\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.562662 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.562549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-podres\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.562662 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.562588 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-lib-modules\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.562662 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.562621 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-proc\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.663193 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.663149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-sys\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.663193 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.663196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxf72\" (UniqueName: \"kubernetes.io/projected/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-kube-api-access-kxf72\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.663461 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.663274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-podres\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.663461 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.663292 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-sys\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.663461 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.663301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-lib-modules\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.663461 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.663371 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-proc\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.663461 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.663414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-lib-modules\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.663461 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.663454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-podres\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.663722 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.663492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-proc\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.671096 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.671072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxf72\" (UniqueName: \"kubernetes.io/projected/525f1a2d-3083-4ebd-a8c7-7633ba0c20df-kube-api-access-kxf72\") pod \"perf-node-gather-daemonset-p6gt9\" (UID: \"525f1a2d-3083-4ebd-a8c7-7633ba0c20df\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.683553 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.683498 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bv7gz_f604a9b3-7355-4f54-8113-531b6f45d7f2/dns/0.log"
Mar 18 17:35:02.707060 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.707030 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bv7gz_f604a9b3-7355-4f54-8113-531b6f45d7f2/kube-rbac-proxy/0.log"
Mar 18 17:35:02.841498 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.841468 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:02.896609 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.896582 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s8lc2_824b5ab3-8c23-48c6-a404-bc9472781b90/dns-node-resolver/0.log"
Mar 18 17:35:02.976880 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:02.976852 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"]
Mar 18 17:35:02.978003 ip-10-0-143-175 kubenswrapper[2578]: W0318 17:35:02.977978 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod525f1a2d_3083_4ebd_a8c7_7633ba0c20df.slice/crio-13fbc5b60aceab29f8c31da1b1db8d49a2753985ef628fc1b36d708085b89e69 WatchSource:0}: Error finding container 13fbc5b60aceab29f8c31da1b1db8d49a2753985ef628fc1b36d708085b89e69: Status 404 returned error can't find the container with id 13fbc5b60aceab29f8c31da1b1db8d49a2753985ef628fc1b36d708085b89e69
Mar 18 17:35:03.349801 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:03.349774 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6ddd9cbcbd-xgxcr_415e43ce-3750-4e1f-9f99-17d7cebca805/registry/0.log"
Mar 18 17:35:03.378847 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:03.378818 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bgzmm_f79627fb-f2a3-4632-89c2-5d99716c1cc9/node-ca/0.log"
Mar 18 17:35:03.388560 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:03.388507 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9" event={"ID":"525f1a2d-3083-4ebd-a8c7-7633ba0c20df","Type":"ContainerStarted","Data":"e013a820a7ed042aab354b308e569d880ca29778eac7dd5829714ae9372ad3f8"}
Mar 18 17:35:03.388560 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:03.388571 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:03.388824 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:03.388590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9" event={"ID":"525f1a2d-3083-4ebd-a8c7-7633ba0c20df","Type":"ContainerStarted","Data":"13fbc5b60aceab29f8c31da1b1db8d49a2753985ef628fc1b36d708085b89e69"}
Mar 18 17:35:03.404873 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:03.404820 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9" podStartSLOduration=1.404805047 podStartE2EDuration="1.404805047s" podCreationTimestamp="2026-03-18 17:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:35:03.402777898 +0000 UTC m=+3057.414886725" watchObservedRunningTime="2026-03-18 17:35:03.404805047 +0000 UTC m=+3057.416913872"
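
The startup-latency entry records a 1.4 s end-to-end start for the must-gather pod; firstStartedPulling and lastFinishedPulling are zero timestamps because no image pull was needed. The "m=+3057.414886725" suffix is Go's monotonic clock reading, roughly 51 minutes since the kubelet process started; durations are computed from it so wall-clock jumps cannot skew them. (The 404 "Failed to process watch event" warning just above is typically a benign race: cadvisor observed the new cri-o cgroup before the container finished registering.) A minimal illustration of the monotonic measurement:

    // The "m=+..." suffix in the entry above is Go's monotonic clock
    // component; durations like podStartE2EDuration are derived from it.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	created := time.Now() // carries a monotonic reading
    	time.Sleep(1400 * time.Millisecond)
    	// time.Since subtracts monotonic readings when both carry them.
    	fmt.Printf("podStartE2EDuration=%s\n", time.Since(created).Round(time.Millisecond))
    }
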
Mar 18 17:35:03.610562 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:35:03.589799 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:35:04.502059 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:04.502022 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gwqfh_3d42738c-ceaa-4925-8255-a2b61010e00f/serve-healthcheck-canary/0.log"
Mar 18 17:35:04.887971 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:04.887936 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-76bdd9f478-x922g_61e0afa5-670e-42e7-8743-980838b63847/insights-operator/0.log"
Mar 18 17:35:04.888584 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:04.888556 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-76bdd9f478-x922g_61e0afa5-670e-42e7-8743-980838b63847/insights-operator/1.log"
Mar 18 17:35:04.985436 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:04.985401 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gn87n_21b7e61b-bd0b-483d-a381-12ff0e42fe1f/kube-rbac-proxy/0.log"
Mar 18 17:35:05.010460 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:05.010428 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gn87n_21b7e61b-bd0b-483d-a381-12ff0e42fe1f/exporter/0.log"
Mar 18 17:35:05.034955 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:05.034914 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gn87n_21b7e61b-bd0b-483d-a381-12ff0e42fe1f/extractor/0.log"
Mar 18 17:35:07.089822 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:07.089749 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-npvlf_8b14e9b6-c9f5-45f4-a14d-34e6ce85ba31/manager/0.log"
Mar 18 17:35:07.593439 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:07.593405 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-6lk5g_4b2bc59c-ccc3-4fdc-9cde-f8bd31d1e1f0/manager/0.log"
Mar 18 17:35:09.401872 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:09.401845 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-p6gt9"
Mar 18 17:35:12.843272 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:12.843239 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6p9n7_85ea82d7-7483-4563-a8fd-b12c358cdc2d/kube-multus/0.log"
Mar 18 17:35:13.209383 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:13.209307 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m75lt_42ddca27-69b7-4448-a710-9cc62df43c14/kube-multus-additional-cni-plugins/0.log"
Mar 18 17:35:13.236603 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:13.236568 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m75lt_42ddca27-69b7-4448-a710-9cc62df43c14/egress-router-binary-copy/0.log"
Mar 18 17:35:13.260896 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:13.260867 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m75lt_42ddca27-69b7-4448-a710-9cc62df43c14/cni-plugins/0.log"
Mar 18 17:35:13.287872 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:13.287845 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m75lt_42ddca27-69b7-4448-a710-9cc62df43c14/bond-cni-plugin/0.log"
Mar 18 17:35:13.317328 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:13.317304 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m75lt_42ddca27-69b7-4448-a710-9cc62df43c14/routeoverride-cni/0.log"
Mar 18 17:35:13.341655 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:13.341630 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m75lt_42ddca27-69b7-4448-a710-9cc62df43c14/whereabouts-cni-bincopy/0.log"
Mar 18 17:35:13.366686 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:13.366660 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m75lt_42ddca27-69b7-4448-a710-9cc62df43c14/whereabouts-cni/0.log"
Mar 18 17:35:13.452558 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:13.452507 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bcgtq_4bd683e7-9eae-4e40-ba74-10c5acb6fd8b/network-metrics-daemon/0.log"
Mar 18 17:35:13.475929 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:13.475848 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bcgtq_4bd683e7-9eae-4e40-ba74-10c5acb6fd8b/kube-rbac-proxy/0.log"
Mar 18 17:35:15.155513 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:15.155481 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrlf7_5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa/ovn-controller/0.log"
Mar 18 17:35:15.202381 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:15.202351 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrlf7_5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa/ovn-acl-logging/0.log"
Mar 18 17:35:15.229090 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:15.229063 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrlf7_5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa/kube-rbac-proxy-node/0.log"
Mar 18 17:35:15.256017 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:15.255990 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrlf7_5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa/kube-rbac-proxy-ovn-metrics/0.log"
Mar 18 17:35:15.278166 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:15.278145 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrlf7_5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa/northd/0.log"
Mar 18 17:35:15.302375 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:15.302348 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrlf7_5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa/nbdb/0.log"
Mar 18 17:35:15.328117 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:15.328087 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrlf7_5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa/sbdb/0.log"
Mar 18 17:35:15.512749 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:15.512639 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrlf7_5b66f0ce-7ce3-4f1b-9d2c-c7f629da04fa/ovnkube-controller/0.log"
Mar 18 17:35:15.589245 ip-10-0-143-175 kubenswrapper[2578]: E0318 17:35:15.589214 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-stcsx" podUID="4f796fc2-c6ea-4936-8fae-6cd0d2b876db"
Mar 18 17:35:16.552097 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:16.552062 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-cc88fdd44-2v9sz_8ed2bd8c-3d8b-4f0b-84ac-e32e792e1e83/check-endpoints/0.log"
Mar 18 17:35:16.581186 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:16.581152 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ml2kv_6fad1cd6-2abc-416f-8534-fda50c4cccd9/network-check-target-container/0.log"
Mar 18 17:35:17.597652 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:17.597622 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-htm47_67114ccf-1178-47ed-acd5-eaf483936c9f/iptables-alerter/0.log"
Mar 18 17:35:18.263920 ip-10-0-143-175 kubenswrapper[2578]: I0318 17:35:18.263889 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bdr5f_dbc66209-3812-4679-96f3-4550a7abfa0c/tuned/0.log"