Mar 18 16:44:41.568646 ip-10-0-139-49 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:44:42.016485 ip-10-0-139-49 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:42.016485 ip-10-0-139-49 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:44:42.016485 ip-10-0-139-49 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:42.016485 ip-10-0-139-49 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:44:42.016485 ip-10-0-139-49 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:42.020035 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.019945 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
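
The five deprecation warnings above all point at the kubelet config file passed via --config (logged below as FLAG: --config="/etc/kubernetes/kubelet.conf"). As a hedged sketch, the same settings could be expressed as KubeletConfiguration (kubelet.config.k8s.io/v1beta1) fields; the values are copied from the FLAG dump later in this log, but the file itself is illustrative, not this node's actual kubelet.conf:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# The unix:// scheme on the endpoint is an assumption; the flag form in the
# FLAG dump below omits it.
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"      # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # replaces --volume-plugin-dir
systemReserved:                                                 # replaces --system-reserved
  cpu: "500m"
  ephemeral-storage: "1Gi"
  memory: "1Gi"
# --minimum-container-ttl-duration has no direct config-file equivalent; the
# warning says to use eviction thresholds instead, e.g. (illustrative value):
evictionHard:
  memory.available: "100Mi"
```
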
Mar 18 16:44:42.024540 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024519 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:42.024540 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024536 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:42.024540 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024539 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:42.024540 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024542 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:42.024540 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024545 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:42.024540 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024547 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024550 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024553 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024556 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024559 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024561 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024564 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024566 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024569 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024571 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024574 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024577 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024580 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024582 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024585 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024587 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024589 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024592 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024594 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024597 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:42.024753 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024599 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024602 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024604 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024607 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024616 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024618 2575 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024621 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024623 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024626 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024628 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024630 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024633 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024635 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024637 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024641 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024644 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024648 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024653 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024656 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024659 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:42.025224 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024661 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024664 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024667 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024669 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024672 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024674 2575 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024677 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024679 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024681 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024684 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024687 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024690 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024693 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024695 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024698 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024701 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024703 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024706 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024708 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024711 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:42.025769 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024713 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024716 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024719 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024721 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024723 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024726 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024730 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024732 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024735 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024737 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024740 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024742 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024745 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024748 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024753 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024757 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024760 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024763 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024766 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:42.026246 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024769 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.024771 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
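
Every name in these runs of "unrecognized feature gate" warnings is an OpenShift-level gate (the quay.io/openshift-release-dev pause image in the FLAG dump below also marks this as an OpenShift node): the cluster's FeatureGate configuration is rendered into the kubelet's gate list wholesale, and the upstream kubelet, which registers only Kubernetes gates, warns once per unknown name at feature_gate.go:328 and otherwise ignores it. The same list repeats below because the gate set is parsed more than once during startup. A sketch of the cluster-scoped object that normally drives this list, assuming a stock OpenShift cluster; the resource shape is config.openshift.io/v1, but the spec shown is illustrative:

```yaml
apiVersion: config.openshift.io/v1
kind: FeatureGate
metadata:
  name: cluster
spec:
  # Illustrative: selecting a named feature set; leaving spec empty
  # selects the Default set.
  featureSet: TechPreviewNoUpgrade
```
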
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026541 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026550 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026553 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026556 2575 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026559 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026562 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026564 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026567 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026570 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026572 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026575 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026577 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026580 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026582 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026585 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026587 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026589 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026592 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:42.026716 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026594 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026597 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026600 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026602 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026605 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026607 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026610 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026612 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026615 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026617 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026619 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026622 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026624 2575 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026627 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026630 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026632 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026635 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026637 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026640 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026643 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:42.027180 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026646 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026650 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026653 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026656 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026659 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026662 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026664 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026667 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026670 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026673 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026676 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026679 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026682 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026684 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026686 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026689 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026691 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026694 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026696 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:42.027686 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026699 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026702 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026704 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026706 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026709 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026712 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026715 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026717 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026720 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026723 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026726 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026729 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026731 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026734 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026737 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026740 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026742 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026745 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026747 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:42.028144 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026749 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026752 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026754 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026757 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026759 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026761 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026764 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026766 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026769 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.026771 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026839 2575 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026847 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026853 2575 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026857 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026862 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026865 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026869 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026874 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026878 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026881 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026884 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026887 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:44:42.028621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026890 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026893 2575 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026896 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026898 2575 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026901 2575 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026904 2575 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026907 2575 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026911 2575 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026914 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026917 2575 flags.go:64] FLAG: --config-dir=""
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026919 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026923 2575 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026926 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026929 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026932 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026935 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026938 2575 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026941 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026944 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026948 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026950 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026954 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026957 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026960 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026963 2575 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 16:44:42.029144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026966 2575 flags.go:64] FLAG: --enable-server="true"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026969 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026973 2575 flags.go:64] FLAG: --event-burst="100"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026976 2575 flags.go:64] FLAG: --event-qps="50"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026979 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026982 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026985 2575 flags.go:64] FLAG: --eviction-hard=""
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026988 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026991 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026994 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026997 2575 flags.go:64] FLAG: --eviction-soft=""
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.026999 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027002 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027005 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027008 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027010 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027013 2575 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027016 2575 flags.go:64] FLAG: --feature-gates=""
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027019 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027022 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027025 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027028 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027031 2575 flags.go:64] FLAG: --healthz-port="10248"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027034 2575 flags.go:64] FLAG: --help="false"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027036 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.029769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027040 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027043 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027046 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027049 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027052 2575 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027055 2575 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027058 2575 flags.go:64] FLAG: --image-service-endpoint=""
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027061 2575 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027064 2575 flags.go:64] FLAG: --kube-api-burst="100"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027067 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027070 2575 flags.go:64] FLAG: --kube-api-qps="50"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027073 2575 flags.go:64] FLAG: --kube-reserved=""
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027076 2575 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027079 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027082 2575 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027084 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027087 2575 flags.go:64] FLAG: --lock-file=""
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027090 2575 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027093 2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027095 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027100 2575 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027103 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027106 2575 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 18 16:44:42.030419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027108 2575 flags.go:64] FLAG: --logging-format="text"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027111 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027114 2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027117 2575 flags.go:64] FLAG: --manifest-url=""
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027120 2575 flags.go:64] FLAG: --manifest-url-header=""
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027124 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027127 2575 flags.go:64] FLAG: --max-open-files="1000000"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027131 2575 flags.go:64] FLAG: --max-pods="110"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027133 2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027136 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027139 2575 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027142 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027145 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027147 2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027150 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027157 2575 flags.go:64] FLAG: --node-status-max-images="50"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027160 2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027163 2575 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027167 2575 flags.go:64] FLAG: --pod-cidr=""
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027170 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027175 2575 flags.go:64] FLAG: --pod-manifest-path=""
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027178 2575 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027181 2575 flags.go:64] FLAG: --pods-per-core="0"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027184 2575 flags.go:64] FLAG: --port="10250"
Mar 18 16:44:42.030978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027187 2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027190 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c7dc76d52043a084"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027193 2575 flags.go:64] FLAG: --qos-reserved=""
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027196 2575 flags.go:64] FLAG: --read-only-port="10255"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027199 2575 flags.go:64] FLAG: --register-node="true"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027202 2575 flags.go:64] FLAG: --register-schedulable="true"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027204 2575 flags.go:64] FLAG: --register-with-taints=""
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027208 2575 flags.go:64] FLAG: --registry-burst="10"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027211 2575 flags.go:64] FLAG: --registry-qps="5"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027213 2575 flags.go:64] FLAG: --reserved-cpus=""
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027216 2575 flags.go:64] FLAG: --reserved-memory=""
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027219 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027222 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027225 2575 flags.go:64] FLAG: --rotate-certificates="false"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027228 2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027230 2575 flags.go:64] FLAG: --runonce="false"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027233 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027236 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027239 2575 flags.go:64] FLAG: --seccomp-default="false"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027242 2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027244 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027247 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027250 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027253 2575 flags.go:64] FLAG: --storage-driver-password="root"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027256 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027258 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 18 16:44:42.031564 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027261 2575 flags.go:64] FLAG: --storage-driver-user="root"
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027268 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027271 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027274 2575 flags.go:64] FLAG: --system-cgroups=""
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027277 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027282 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027285 2575 flags.go:64] FLAG: --tls-cert-file=""
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027288 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027293 2575 flags.go:64] FLAG: --tls-min-version=""
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027296 2575 flags.go:64] FLAG: --tls-private-key-file=""
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027307 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027310 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027313 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027316 2575 flags.go:64] FLAG: --v="2"
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027321 2575 flags.go:64] FLAG: --version="false"
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027325 2575 flags.go:64] FLAG: --vmodule=""
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027329 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027332 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
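
The flags.go:64 dump above records the parsed value of every command-line flag; it is emitted because the kubelet runs at log verbosity 2 (FLAG: --v="2"), and it reflects flags only, so settings from the --config file do not appear in it. Verbosity itself can also live in the config file; a minimal sketch using upstream KubeletConfiguration fields (illustrative, not this node's actual config):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
logging:
  format: text   # matches FLAG: --logging-format="text" above
  verbosity: 2   # matches FLAG: --v="2" above
```
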
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027437 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027442 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027445 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027448 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027451 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027453 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:42.032187 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027456 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027458 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027462 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027465 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027468 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027471 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027474 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027476 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027479 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027483 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027486 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027488 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027491 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027494 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027496 2575 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027499 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027502 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027505 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027507 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:42.032762 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027510 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027513 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027515 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027518 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027520 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027522 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027525 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027527 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027530 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027532 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027535 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027537 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027540 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027542 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027545 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027547 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027550 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027552 2575 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027555 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027558 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:42.033262 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027562 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027564 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027568 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027570 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027573 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027576 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027579 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027581 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027584 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027588 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027590 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027593 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027595 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027598 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027600 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027602 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027605 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027607 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027610 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027612 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:42.033824 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027614 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
16:44:42.027614 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027617 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027619 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027622 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027624 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027627 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027629 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027631 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027634 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027636 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027639 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027641 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027644 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027646 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027650 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027652 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027655 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027661 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027663 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027666 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:42.034315 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.027668 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:42.034817 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.027681 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false 
ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:42.035803 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.035786 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 18 16:44:42.035838 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.035804 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 16:44:42.035870 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035856 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:42.035870 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035861 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:42.035870 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035864 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:42.035870 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035867 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:42.035870 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035870 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035873 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035876 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035880 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035883 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035886 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035888 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035891 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035893 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035896 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035898 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035901 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035904 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035906 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035909 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 
16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035911 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035914 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035917 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035919 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035921 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:42.035993 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035924 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035926 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035929 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035931 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035933 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035936 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035938 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035944 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035947 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035950 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035953 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035955 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035957 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035960 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035963 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035966 2575 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035968 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:42.036497 ip-10-0-139-49 
kubenswrapper[2575]: W0318 16:44:42.035971 2575 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035973 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:42.036497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035977 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035981 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035984 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035987 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035990 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035993 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035996 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.035999 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036002 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036004 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036007 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036009 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036012 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036014 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036017 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036019 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036022 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036024 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036027 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036029 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:42.036966 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036032 2575 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036035 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036037 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036041 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036045 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036048 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036050 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036053 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036056 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036058 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036061 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036063 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036066 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036068 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036070 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036073 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036075 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036078 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036080 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036083 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:42.037470 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036085 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036087 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036090 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:42.037951 
ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.036095 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036199 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036204 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036208 2575 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036210 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036214 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036217 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036220 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036222 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036225 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036227 2575 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036231 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Mar 18 16:44:42.037951 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036235 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036239 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036241 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036244 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036248 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
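The walls of "unrecognized feature gate" warnings above are the kubelet rejecting gate names it does not know: the wrapper appears to receive the cluster-wide OpenShift feature-gate list, and gates that presumably belong to other operators (ClusterMonitoringConfig, GatewayAPIController, NewOLM, ...) have no entry in the kubelet's own registry, so each parsing pass logs one warning per unknown name and keeps only the recognized gates for the "feature gates: {map[...]}" summary line. As a minimal sketch (not this node's actual file), the recognized gates would normally be set in the KubeletConfiguration passed via --config rather than on the command line; the stanza below echoes values from the summary above, and the heredoc only prints it:

    # sketch: how the recognized gates look in a KubeletConfiguration (--config file)
    cat <<'EOF'
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      KMSv1: true                  # deprecated, per the warning above
      UserNamespacesSupport: true
      ProcMountType: true
      DynamicResourceAllocation: false
    EOF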
Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036252 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036254 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036257 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036259 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036262 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036264 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036267 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036269 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036272 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036275 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036277 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036280 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036282 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036285 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:42.038409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036287 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036289 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036292 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036294 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036297 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036299 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036301 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036304 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:42.038877 ip-10-0-139-49 
kubenswrapper[2575]: W0318 16:44:42.036306 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036309 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036311 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036313 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036315 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036318 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036321 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036323 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036326 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036328 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036331 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036333 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:42.038877 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036336 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036338 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036341 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036343 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036345 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036348 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036350 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036352 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036371 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036376 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036379 2575 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036382 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036384 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036387 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036389 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036391 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036394 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036396 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036399 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036401 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:42.039409 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036403 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036406 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036408 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036410 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036413 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036415 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036419 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036422 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036424 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036427 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036430 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036432 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036435 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:42.039960 ip-10-0-139-49 
kubenswrapper[2575]: W0318 16:44:42.036437 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036440 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:42.036442 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:42.039960 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.036447 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:42.040395 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.037168 2575 server.go:962] "Client rotation is on, will bootstrap in background" Mar 18 16:44:42.040395 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.039184 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 16:44:42.040395 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.040277 2575 server.go:1019] "Starting client certificate rotation" Mar 18 16:44:42.040395 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.040384 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:42.040574 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.040431 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:42.071662 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.071636 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:42.076592 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.076574 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:42.095878 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.095858 2575 log.go:25] "Validated CRI v1 runtime API" Mar 18 16:44:42.101632 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.101616 2575 log.go:25] "Validated CRI v1 image API" Mar 18 16:44:42.101841 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.101815 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 18 16:44:42.102857 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.102841 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 16:44:42.107484 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.107458 2575 fs.go:135] Filesystem UUIDs: map[4cfdd512-cb39-4cc3-be39-2e9301849acf:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c68c90c4-4101-4f9d-b678-35274b205fa0:/dev/nvme0n1p3] Mar 18 16:44:42.107569 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.107483 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} 
/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Mar 18 16:44:42.113293 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.113189 2575 manager.go:217] Machine: {Timestamp:2026-03-18 16:44:42.11120292 +0000 UTC m=+0.422455832 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100899 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23d1282c41fd0af98599b592de550f SystemUUID:ec23d128-2c41-fd0a-f985-99b592de550f BootID:6e44ecd1-776b-41a7-8c2b-2b85623a3d37 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:85:a9:98:ad:8f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:85:a9:98:ad:8f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:3b:36:9b:6b:a4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 16:44:42.113293 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.113289 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
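The Machine entry above is cAdvisor's hardware inventory of the node: 8 hyperthreads on 4 physical cores at ~3.1 GHz, ~30.6 GiB of memory, no swap, and a single 120 GiB NVMe disk carved into /boot (ext4), /var (xfs) and a composefs overlay root. The same facts can be cross-checked from the host with standard util-linux tools; a sketch, with the device name taken from the DiskMap entry:

    # cross-check cAdvisor's filesystem inventory from the node itself
    lsblk -o NAME,FSTYPE,UUID,SIZE,MOUNTPOINT /dev/nvme0n1
    findmnt -t xfs,ext4,tmpfs,overlay -o TARGET,SOURCE,FSTYPE,SIZE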
Mar 18 16:44:42.113408 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.113379 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 18 16:44:42.114403 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.114380 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 18 16:44:42.114563 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.114405 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-49.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 18 16:44:42.114636 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.114577 2575 topology_manager.go:138] "Creating topology manager with none policy" Mar 18 16:44:42.114636 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.114592 2575 container_manager_linux.go:306] "Creating device plugin manager" Mar 18 16:44:42.114636 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.114611 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:44:42.115738 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.115725 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:44:42.117026 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.117014 2575 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:44:42.117163 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.117152 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Mar 18 16:44:42.119767 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.119756 2575 kubelet.go:491] "Attempting to sync node with API server" Mar 18 16:44:42.119836 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.119780 2575 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Mar 18 16:44:42.119836 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.119795 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 16:44:42.119836 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.119808 2575 kubelet.go:397] "Adding apiserver pod source" Mar 18 16:44:42.119836 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.119820 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 18 16:44:42.120968 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.120955 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:44:42.121046 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.120980 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:44:42.124482 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.124465 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1" Mar 18 16:44:42.126456 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.126443 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 18 16:44:42.128381 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128347 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 16:44:42.128435 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128385 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 16:44:42.128435 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128393 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 16:44:42.128435 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128398 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 16:44:42.128435 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128404 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 16:44:42.128435 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128410 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 16:44:42.128435 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128415 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 16:44:42.128435 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128421 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 16:44:42.128435 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128428 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 16:44:42.128435 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128434 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 16:44:42.128671 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128442 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 16:44:42.128671 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.128451 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 16:44:42.129397 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.129376 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 18 16:44:42.129540 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.129520 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-49.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 18 16:44:42.130389 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.130377 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 16:44:42.130435 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.130399 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Mar 18 16:44:42.133312 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.133296 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-49.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 16:44:42.133778 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.133766 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 18 16:44:42.133827 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.133803 2575 server.go:1295] "Started kubelet" Mar 18 16:44:42.133938 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.133874 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 16:44:42.133989 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.133945 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 16:44:42.134020 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.134014 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 18 16:44:42.134652 ip-10-0-139-49 systemd[1]: Started Kubernetes Kubelet. 
Mar 18 16:44:42.135649 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.135549 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 16:44:42.135869 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.135851 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vscxx" Mar 18 16:44:42.136757 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.136739 2575 server.go:317] "Adding debug handlers to kubelet server" Mar 18 16:44:42.141645 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.141622 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vscxx" Mar 18 16:44:42.145105 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.145089 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 16:44:42.145175 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.145134 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Mar 18 16:44:42.145774 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.145756 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 18 16:44:42.145774 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.145758 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Mar 18 16:44:42.145906 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.145787 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 18 16:44:42.145985 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.145973 2575 reconstruct.go:97] "Volume reconstruction finished" Mar 18 16:44:42.145985 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.145984 2575 reconciler.go:26] "Reconciler: start to sync state" Mar 18 16:44:42.146087 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.146014 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 16:44:42.146087 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.146030 2575 factory.go:55] Registering systemd factory Mar 18 16:44:42.146087 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.146041 2575 factory.go:223] Registration of the systemd container factory successfully Mar 18 16:44:42.146176 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.146083 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found" Mar 18 16:44:42.146333 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.146313 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 18 16:44:42.146484 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.146402 2575 factory.go:153] Registering CRI-O factory Mar 18 16:44:42.146575 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.146490 2575 factory.go:223] Registration of the crio container factory successfully Mar 18 16:44:42.146575 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.146516 2575 factory.go:103] Registering Raw factory Mar 18 16:44:42.146575 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.146533 2575 manager.go:1196] Started watching for new ooms in manager Mar 18 16:44:42.146928 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.146914 2575 manager.go:319] Starting recovery of all containers Mar 18 16:44:42.154504 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.154314 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:42.157132 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.157111 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-49.ec2.internal\" not found" node="ip-10-0-139-49.ec2.internal" Mar 18 16:44:42.157652 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.157640 2575 manager.go:324] Recovery completed Mar 18 16:44:42.162201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.162188 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:42.164752 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.164736 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:42.164829 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.164769 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:42.164829 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.164785 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:42.165237 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.165223 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Mar 18 16:44:42.165237 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.165237 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Mar 18 16:44:42.165343 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.165258 2575 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:44:42.166439 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.166428 2575 policy_none.go:49] "None policy: Start" Mar 18 16:44:42.166483 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.166444 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 18 16:44:42.166483 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.166453 2575 state_mem.go:35] "Initializing new in-memory state store" Mar 18 16:44:42.211938 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.211915 2575 manager.go:341] "Starting Device Plugin manager" Mar 18 16:44:42.234056 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.211960 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 18 16:44:42.234056 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.211973 2575 server.go:85] "Starting device plugin registration server" 
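"Failed to read data from checkpoint ... checkpoint is not found" is normal on first start: the device-plugin manager appears to persist its allocation state under /var/lib/kubelet/device-plugins/ (the same directory as the kubelet.sock registration socket logged earlier) and writes the checkpoint file only once it has state to record. A sketch for inspecting that state on the node, using only names quoted in the log (the directory-to-checkpoint pairing is an assumption):

    # device-plugin state the log refers to, on disk
    ls -l /var/lib/kubelet/device-plugins/            # kubelet.sock and plugin sockets
    test -f /var/lib/kubelet/device-plugins/kubelet_internal_checkpoint \
      || echo "no checkpoint yet (matches the 'checkpoint is not found' warning)"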
Mar 18 16:44:42.234056 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.212223 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 16:44:42.234056 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.212236 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 16:44:42.234056 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.212322 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 16:44:42.234056 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.212412 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 16:44:42.234056 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.212421 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 16:44:42.234056 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.212994 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Mar 18 16:44:42.234056 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.213057 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-49.ec2.internal\" not found" Mar 18 16:44:42.283839 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.283788 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 18 16:44:42.285141 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.285127 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 18 16:44:42.285197 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.285161 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 18 16:44:42.285197 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.285179 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
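The eviction manager that just started its control loop enforces the HardEvictionThresholds from the nodeConfig dump further up (memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%, alongside systemReserved of 500m CPU, 1Gi memory and 1Gi ephemeral-storage). Expressed in KubeletConfiguration form, which is where such values normally come from, the same settings would read as below; a sketch that only prints the stanza, not this node's actual file:

    # sketch: the logged thresholds in KubeletConfiguration form
    cat <<'EOF'
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    systemReserved:
      cpu: 500m
      memory: 1Gi
      ephemeral-storage: 1Gi
    evictionHard:
      memory.available: 100Mi
      nodefs.available: "10%"
      nodefs.inodesFree: "5%"
      imagefs.available: "15%"
      imagefs.inodesFree: "5%"
    EOF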
Mar 18 16:44:42.285197 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.285190 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Mar 18 16:44:42.285329 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.285220 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 18 16:44:42.289455 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.289439 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:42.312553 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.312532 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:42.313546 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.313521 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:42.313618 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.313559 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:42.313618 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.313575 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:42.313618 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.313596 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.322367 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.322334 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.322425 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.322353 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-49.ec2.internal\": node \"ip-10-0-139-49.ec2.internal\" not found"
Mar 18 16:44:42.341461 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.341445 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Mar 18 16:44:42.385628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.385592 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal"]
Mar 18 16:44:42.385688 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.385662 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:42.387587 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.387567 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:42.387668 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.387592 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:42.387668 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.387602 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:42.389714 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.389703 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:42.389882 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.389866 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.389944 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.389901 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:42.390341 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.390327 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:42.390434 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.390349 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:42.390434 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.390383 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:42.390434 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.390423 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:42.390542 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.390440 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:42.390542 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.390454 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID"
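The register/lister pair above is normal at first boot: kubelet_node_status.go:81 confirms the Node API object was created, while the "node not found" errors keep firing until the kubelet's own node informer cache catches up. A minimal client-go sketch that performs the equivalent lookup (the kubeconfig path here is an assumption, not taken from the log):

    package main

    import (
    	"context"
    	"fmt"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig") // assumed path
    	if err != nil {
    		panic(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		panic(err)
    	}
    	node, err := cs.CoreV1().Nodes().Get(context.TODO(), "ip-10-0-139-49.ec2.internal", metav1.GetOptions{})
    	if err != nil {
    		fmt.Println("not visible yet:", err) // the same condition behind the errors above
    		return
    	}
    	fmt.Println("registered:", node.Name)
    }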
Mar 18 16:44:42.392384 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.392352 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.392476 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.392393 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:42.393612 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.393595 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:42.393715 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.393622 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:42.393715 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.393634 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:42.414877 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.414853 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-49.ec2.internal\" not found" node="ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.419107 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.419091 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-49.ec2.internal\" not found" node="ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.442270 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.442255 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Mar 18 16:44:42.447231 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.447214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bbdfc43637cf5d326752644715aecd84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"bbdfc43637cf5d326752644715aecd84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.447283 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.447242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbdfc43637cf5d326752644715aecd84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"bbdfc43637cf5d326752644715aecd84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.447326 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.447299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ea09336bc3d6205db966e5eb690fe85d-config\") pod \"kube-apiserver-proxy-ip-10-0-139-49.ec2.internal\" (UID: \"ea09336bc3d6205db966e5eb690fe85d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.542743 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.542692 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
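The volumes being verified here (etc-kube, var-lib-kubelet, config) are hostPath mounts declared by the two file-sourced static pods. A sketch of that volume shape using the k8s.io/api/core/v1 types — the host path itself is an assumption, since the log only records the volume name and UniqueName:

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    )

    func main() {
    	hostPathType := corev1.HostPathDirectory
    	vol := corev1.Volume{
    		Name: "etc-kube", // name from the log
    		VolumeSource: corev1.VolumeSource{
    			HostPath: &corev1.HostPathVolumeSource{
    				Path: "/etc/kubernetes", // assumed; not recorded in the log
    				Type: &hostPathType,
    			},
    		},
    	}
    	fmt.Printf("%+v\n", vol)
    }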
\"kube-apiserver-proxy-ip-10-0-139-49.ec2.internal\" (UID: \"ea09336bc3d6205db966e5eb690fe85d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" Mar 18 16:44:42.548109 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.548062 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bbdfc43637cf5d326752644715aecd84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"bbdfc43637cf5d326752644715aecd84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Mar 18 16:44:42.548109 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.548078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbdfc43637cf5d326752644715aecd84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"bbdfc43637cf5d326752644715aecd84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Mar 18 16:44:42.548204 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.548145 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbdfc43637cf5d326752644715aecd84-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"bbdfc43637cf5d326752644715aecd84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Mar 18 16:44:42.548204 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.548151 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ea09336bc3d6205db966e5eb690fe85d-config\") pod \"kube-apiserver-proxy-ip-10-0-139-49.ec2.internal\" (UID: \"ea09336bc3d6205db966e5eb690fe85d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" Mar 18 16:44:42.548204 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.548167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bbdfc43637cf5d326752644715aecd84-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"bbdfc43637cf5d326752644715aecd84\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Mar 18 16:44:42.643459 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.643428 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found" Mar 18 16:44:42.717953 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.717918 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Mar 18 16:44:42.720465 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.720449 2575 util.go:30] "No sandbox for pod can be found. 
Mar 18 16:44:42.720465 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:42.720449 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal"
Mar 18 16:44:42.743805 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.743774 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Mar 18 16:44:42.844389 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.844281 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Mar 18 16:44:42.944759 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:42.944731 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Mar 18 16:44:43.040245 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.040221 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 18 16:44:43.040853 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.040389 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:44:43.040853 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.040438 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:44:43.045349 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:43.045328 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Mar 18 16:44:43.144272 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.144231 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:39:42 +0000 UTC" deadline="2027-10-04 18:18:30.177727348 +0000 UTC"
Mar 18 16:44:43.144272 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.144265 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13561h33m47.033465231s"
Mar 18 16:44:43.145333 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.145317 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Mar 18 16:44:43.145613 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:43.145595 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Mar 18 16:44:43.154640 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.154624 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:44:43.179225 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.179202 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-lqw9m"
Mar 18 16:44:43.186398 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.186373 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-lqw9m"
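certificate_manager.go picks its next rotation point at a jittered fraction of the certificate's validity window; with the expiration shown above (2028-03-17), the chosen deadline (2027-10-04) sits roughly 77% of the way through the lifetime. A sketch of that arithmetic, assuming the commonly used 70-90% jitter band and an issue time one lifetime before expiration (both assumptions; the log only shows the result):

    package main

    import (
    	"fmt"
    	"math/rand"
    	"time"
    )

    func rotationDeadline(notBefore, notAfter time.Time) time.Time {
    	total := notAfter.Sub(notBefore)
    	// Somewhere in the assumed 70-90% band of the lifetime, chosen at random.
    	frac := 0.7 + 0.2*rand.Float64()
    	return notBefore.Add(time.Duration(float64(total) * frac))
    }

    func main() {
    	notBefore := time.Date(2026, time.March, 18, 16, 39, 42, 0, time.UTC) // assumed issue time
    	notAfter := time.Date(2028, time.March, 17, 16, 39, 42, 0, time.UTC)  // expiration from the log
    	d := rotationDeadline(notBefore, notAfter)
    	fmt.Println("deadline:", d)
    }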
Mar 18 16:44:43.213677 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.213656 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:43.236169 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:43.236144 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea09336bc3d6205db966e5eb690fe85d.slice/crio-767f62cd627420a3c07b149b5384eadd2adfff2fb67a85accab0f5a160ef6ce2 WatchSource:0}: Error finding container 767f62cd627420a3c07b149b5384eadd2adfff2fb67a85accab0f5a160ef6ce2: Status 404 returned error can't find the container with id 767f62cd627420a3c07b149b5384eadd2adfff2fb67a85accab0f5a160ef6ce2
Mar 18 16:44:43.236539 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:43.236521 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbdfc43637cf5d326752644715aecd84.slice/crio-fd62232efd81d6bd7a8a8e1aa49276e32400c661efb346a965d3573688319c15 WatchSource:0}: Error finding container fd62232efd81d6bd7a8a8e1aa49276e32400c661efb346a965d3573688319c15: Status 404 returned error can't find the container with id fd62232efd81d6bd7a8a8e1aa49276e32400c661efb346a965d3573688319c15
Mar 18 16:44:43.241117 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.241102 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:44:43.245909 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.245891 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal"
Mar 18 16:44:43.256474 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.256458 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 18 16:44:43.258116 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.258103 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal"
Mar 18 16:44:43.267130 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.267114 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 18 16:44:43.288047 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.288010 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" event={"ID":"ea09336bc3d6205db966e5eb690fe85d","Type":"ContainerStarted","Data":"767f62cd627420a3c07b149b5384eadd2adfff2fb67a85accab0f5a160ef6ce2"}
Mar 18 16:44:43.288973 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.288955 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" event={"ID":"bbdfc43637cf5d326752644715aecd84","Type":"ContainerStarted","Data":"fd62232efd81d6bd7a8a8e1aa49276e32400c661efb346a965d3573688319c15"}
Mar 18 16:44:43.564550 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:43.563997 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:44.032433 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.032336 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
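The "Creating a mirror pod" lines cover the kubelet publishing API-server copies of its file-sourced static pods; the metadata.name warning fires because the dotted node name is embedded in the pod name. A sketch of a mirror pod's distinguishing metadata, assuming the conventional kubernetes.io/config.mirror annotation carries the static pod's hash (annotation values illustrative, not read from this log):

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    )

    func main() {
    	mirror := corev1.Pod{
    		ObjectMeta: metav1.ObjectMeta{
    			Name:      "kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal",
    			Namespace: "openshift-machine-config-operator",
    			Annotations: map[string]string{
    				"kubernetes.io/config.mirror": "bbdfc43637cf5d326752644715aecd84", // illustrative hash
    				"kubernetes.io/config.source": "file",
    			},
    		},
    	}
    	fmt.Println(mirror.Name, mirror.Annotations)
    }

The two cadvisor W-level "Failed to process watch event ... 404" warnings are a benign race: the cgroup watch fired before the CRI-O container metadata was queryable, and the same container IDs show up healthy in the PLEG ContainerStarted events just after.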
Mar 18 16:44:44.121627 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.121598 2575 apiserver.go:52] "Watching apiserver"
Mar 18 16:44:44.126792 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.126760 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Mar 18 16:44:44.127177 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.127152 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-nbj8s","openshift-network-diagnostics/network-check-target-hr9bv","openshift-network-operator/iptables-alerter-kcvj8","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd","openshift-cluster-node-tuning-operator/tuned-mwhvr","openshift-image-registry/node-ca-lqcmn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal","openshift-ovn-kubernetes/ovnkube-node-qktrm","kube-system/konnectivity-agent-88mp7","kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal","openshift-dns/node-resolver-kmk8h","openshift-multus/multus-additional-cni-plugins-w2ddv","openshift-multus/multus-qq8q7"]
Mar 18 16:44:44.129392 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.129351 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w2ddv"
Mar 18 16:44:44.130626 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.130596 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:44:44.130710 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.130678 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d"
Mar 18 16:44:44.131625 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.131601 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Mar 18 16:44:44.131820 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.131802 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Mar 18 16:44:44.131915 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.131832 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Mar 18 16:44:44.131985 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.131972 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Mar 18 16:44:44.132070 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.132055 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t8cpw\""
Mar 18 16:44:44.132194 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.132180 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
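The NetworkReady=false failures are expected at this point: the runtime reports no file in /etc/kubernetes/cni/net.d/, so every pod that needs pod networking is skipped until ovnkube-node writes a CNI config. A sketch reproducing that check (directory name from the log; the accepted extensions are an assumption):

    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    func main() {
    	dir := "/etc/kubernetes/cni/net.d" // directory named in the error above
    	entries, err := os.ReadDir(dir)
    	if err != nil {
    		fmt.Println("cannot read CNI conf dir:", err)
    		return
    	}
    	var confs []string
    	for _, e := range entries {
    		switch filepath.Ext(e.Name()) {
    		case ".conf", ".conflist", ".json": // assumed set of config extensions
    			confs = append(confs, e.Name())
    		}
    	}
    	if len(confs) == 0 {
    		fmt.Println("NetworkReady=false: no CNI configuration file in", dir)
    		return
    	}
    	fmt.Println("CNI configs:", confs)
    }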
Mar 18 16:44:44.133594 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.132990 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kcvj8"
Mar 18 16:44:44.134438 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.134391 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd"
Mar 18 16:44:44.134966 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.134948 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:44:44.135060 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.135044 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Mar 18 16:44:44.135104 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.135070 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vgjtp\""
Mar 18 16:44:44.135198 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.135179 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Mar 18 16:44:44.135653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.135632 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.136167 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.135781 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lqcmn"
Mar 18 16:44:44.136167 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.136086 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Mar 18 16:44:44.136908 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.136888 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:44:44.136992 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.136972 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273"
Mar 18 16:44:44.137423 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.137401 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Mar 18 16:44:44.137423 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.137353 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Mar 18 16:44:44.137990 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.137970 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Mar 18 16:44:44.138180 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.138160 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:44:44.138882 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.138575 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wld92\""
Mar 18 16:44:44.138882 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.138676 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Mar 18 16:44:44.138882 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.138791 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jlxmp\""
Mar 18 16:44:44.138882 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.138813 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Mar 18 16:44:44.139144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.138986 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kszpw\""
Mar 18 16:44:44.139144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.139016 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Mar 18 16:44:44.140241 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.139919 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.141732 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.141712 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-88mp7"
Mar 18 16:44:44.142156 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.142140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.142674 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.142654 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Mar 18 16:44:44.142750 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.142664 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Mar 18 16:44:44.143824 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.143712 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kmk8h"
Mar 18 16:44:44.144408 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.144380 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Mar 18 16:44:44.144408 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.144401 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Mar 18 16:44:44.144552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.144454 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Mar 18 16:44:44.144858 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.144839 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hhpw6\""
Mar 18 16:44:44.145213 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.145189 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-2vbv9\""
Mar 18 16:44:44.145501 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.145483 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Mar 18 16:44:44.145698 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.144845 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Mar 18 16:44:44.145890 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.145873 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Mar 18 16:44:44.146122 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.146101 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Mar 18 16:44:44.146374 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.146339 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nv6rb\""
Mar 18 16:44:44.147927 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.147909 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 18 16:44:44.148210 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.148121 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bkbxc\""
Mar 18 16:44:44.148577 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.148337 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Mar 18 16:44:44.148577 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.148544 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Mar 18 16:44:44.156190 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-cni-bin\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
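"Finished populating initial desired state of world" marks the point where every assigned pod's volumes are queued for the reconciler; the kube-api-access-* entries that follow are the projected service-account-token volumes injected into each pod. A sketch of that volume's shape in core/v1 types (the 3607-second expiry and the exact source list are the conventional defaults, assumed here rather than read from the log):

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    )

    func main() {
    	expiry := int64(3607) // conventional default; an assumption
    	vol := corev1.Volume{
    		Name: "kube-api-access-l4tmq", // name from the log
    		VolumeSource: corev1.VolumeSource{
    			Projected: &corev1.ProjectedVolumeSource{
    				Sources: []corev1.VolumeProjection{
    					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
    						Path:              "token",
    						ExpirationSeconds: &expiry,
    					}},
    					{ConfigMap: &corev1.ConfigMapProjection{
    						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
    						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
    					}},
    				},
    			},
    		},
    	}
    	fmt.Printf("%s has %d projected sources\n", vol.Name, len(vol.VolumeSource.Projected.Sources))
    }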
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a51f93e2-c750-4700-bb46-109456c7c78a-ovnkube-config\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.156267 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-tuned\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.156406 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-cni-netd\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.156406 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156307 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:44.156406 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0452f967-d43e-4fc8-8591-3f8887d642ef-tmp-dir\") pod \"node-resolver-kmk8h\" (UID: \"0452f967-d43e-4fc8-8591-3f8887d642ef\") " pod="openshift-dns/node-resolver-kmk8h" Mar 18 16:44:44.156406 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156378 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.156406 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156403 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4tmq\" (UniqueName: \"kubernetes.io/projected/42adf8fc-4871-4bee-8ebf-d7519c60b6af-kube-api-access-l4tmq\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.156648 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-systemd-units\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.156648 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-slash\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.156648 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.156648 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a51f93e2-c750-4700-bb46-109456c7c78a-env-overrides\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.156648 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a51f93e2-c750-4700-bb46-109456c7c78a-ovn-node-metrics-cert\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.156648 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-var-lib-kubelet\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.156648 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1def291b-47ff-4e2f-b1a9-30275edc2bd9-host\") pod \"node-ca-lqcmn\" (UID: \"1def291b-47ff-4e2f-b1a9-30275edc2bd9\") " pod="openshift-image-registry/node-ca-lqcmn" Mar 18 16:44:44.156648 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-sys-fs\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-run-openvswitch\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-hostroot\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " 
pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-etc-kubernetes\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156765 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9pp\" (UniqueName: \"kubernetes.io/projected/0452f967-d43e-4fc8-8591-3f8887d642ef-kube-api-access-zh9pp\") pod \"node-resolver-kmk8h\" (UID: \"0452f967-d43e-4fc8-8591-3f8887d642ef\") " pod="openshift-dns/node-resolver-kmk8h" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-sysconfig\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-systemd\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91-agent-certs\") pod \"konnectivity-agent-88mp7\" (UID: \"cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91\") " pod="kube-system/konnectivity-agent-88mp7" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156839 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg\") pod \"network-check-target-hr9bv\" (UID: \"35d62bb3-aced-4698-ba43-4c7f828a050d\") " pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156867 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/af60e70e-528b-44c4-a08d-8bd69ee9547f-iptables-alerter-script\") pod \"iptables-alerter-kcvj8\" (UID: \"af60e70e-528b-44c4-a08d-8bd69ee9547f\") " pod="openshift-network-operator/iptables-alerter-kcvj8" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8f81e7e-bafc-43ea-a249-a4269a4090b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156903 2575 
Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6h7w\" (UniqueName: \"kubernetes.io/projected/f8f81e7e-bafc-43ea-a249-a4269a4090b1-kube-api-access-b6h7w\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv"
Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156918 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-run-netns\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156951 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-run-systemd\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-run-netns\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.157012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.156996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-daemon-config\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157037 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-os-release\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-os-release\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1def291b-47ff-4e2f-b1a9-30275edc2bd9-serviceca\") pod \"node-ca-lqcmn\" (UID: \"1def291b-47ff-4e2f-b1a9-30275edc2bd9\") " pod="openshift-image-registry/node-ca-lqcmn"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-socket-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-modprobe-d\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-sys\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7gfs\" (UniqueName: \"kubernetes.io/projected/a51f93e2-c750-4700-bb46-109456c7c78a-kube-api-access-d7gfs\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-cni-dir\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-run-k8s-cni-cncf-io\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-conf-dir\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkw5t\" (UniqueName: \"kubernetes.io/projected/1def291b-47ff-4e2f-b1a9-30275edc2bd9-kube-api-access-wkw5t\") pod \"node-ca-lqcmn\" (UID: \"1def291b-47ff-4e2f-b1a9-30275edc2bd9\") " pod="openshift-image-registry/node-ca-lqcmn"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f8f81e7e-bafc-43ea-a249-a4269a4090b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a51f93e2-c750-4700-bb46-109456c7c78a-ovnkube-script-lib\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157344 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91-konnectivity-ca\") pod \"konnectivity-agent-88mp7\" (UID: \"cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91\") " pod="kube-system/konnectivity-agent-88mp7"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0452f967-d43e-4fc8-8591-3f8887d642ef-hosts-file\") pod \"node-resolver-kmk8h\" (UID: \"0452f967-d43e-4fc8-8591-3f8887d642ef\") " pod="openshift-dns/node-resolver-kmk8h"
Mar 18 16:44:44.157653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-cnibin\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-etc-selinux\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-sysctl-conf\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-run-ovn\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157538 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-cnibin\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157567 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-socket-dir-parent\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-var-lib-cni-multus\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp5p8\" (UniqueName: \"kubernetes.io/projected/d6948911-017b-4b29-b362-5520b984c273-kube-api-access-mp5p8\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-system-cni-dir\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157659 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84aaa605-3cf0-4df0-9d6c-85011b39807e-cni-binary-copy\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-var-lib-kubelet\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157694 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-log-socket\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-var-lib-openvswitch\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22zln\" (UniqueName: \"kubernetes.io/projected/84aaa605-3cf0-4df0-9d6c-85011b39807e-kube-api-access-22zln\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af60e70e-528b-44c4-a08d-8bd69ee9547f-host-slash\") pod \"iptables-alerter-kcvj8\" (UID: \"af60e70e-528b-44c4-a08d-8bd69ee9547f\") " pod="openshift-network-operator/iptables-alerter-kcvj8"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-sysctl-d\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-kubernetes\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.158074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-host\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e545205-6f68-4391-a421-c5d55b65e2d0-tmp\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-node-log\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-etc-openvswitch\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157861 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-run-multus-certs\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157890 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-run\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wpp\" (UniqueName: \"kubernetes.io/projected/2e545205-6f68-4391-a421-c5d55b65e2d0-kube-api-access-c5wpp\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157923 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-kubelet\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-var-lib-cni-bin\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-registration-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.157974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-lib-modules\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.158007 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-run-ovn-kubernetes\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.158027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wptxk\" (UniqueName: \"kubernetes.io/projected/af60e70e-528b-44c4-a08d-8bd69ee9547f-kube-api-access-wptxk\") pod \"iptables-alerter-kcvj8\" (UID: \"af60e70e-528b-44c4-a08d-8bd69ee9547f\") " pod="openshift-network-operator/iptables-alerter-kcvj8"
(UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.158055 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.158081 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8f81e7e-bafc-43ea-a249-a4269a4090b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.158628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.158103 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-device-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.170559 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.170540 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:44.188013 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.187988 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:43 +0000 UTC" deadline="2027-09-22 07:13:33.284558742 +0000 UTC" Mar 18 16:44:44.188091 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.188034 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13262h28m49.096531879s" Mar 18 16:44:44.258384 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-etc-openvswitch\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.258539 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-run-multus-certs\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.258539 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-run\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.258539 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-etc-openvswitch\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.258539 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wpp\" (UniqueName: \"kubernetes.io/projected/2e545205-6f68-4391-a421-c5d55b65e2d0-kube-api-access-c5wpp\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.258539 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-kubelet\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.258539 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-var-lib-cni-bin\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.258539 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258540 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-run-multus-certs\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-registration-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-run\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-lib-modules\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-run-ovn-kubernetes\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258514 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-kubelet\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wptxk\" (UniqueName: \"kubernetes.io/projected/af60e70e-528b-44c4-a08d-8bd69ee9547f-kube-api-access-wptxk\") pod \"iptables-alerter-kcvj8\" (UID: \"af60e70e-528b-44c4-a08d-8bd69ee9547f\") " pod="openshift-network-operator/iptables-alerter-kcvj8" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-system-cni-dir\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8f81e7e-bafc-43ea-a249-a4269a4090b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-lib-modules\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258775 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-system-cni-dir\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-device-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-run-ovn-kubernetes\") pod \"ovnkube-node-qktrm\" (UID: 
\"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-registration-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-device-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-cni-bin\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.258875 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-var-lib-cni-bin\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-cni-bin\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258885 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a51f93e2-c750-4700-bb46-109456c7c78a-ovnkube-config\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.258960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-tuned\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-cni-netd\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0452f967-d43e-4fc8-8591-3f8887d642ef-tmp-dir\") pod \"node-resolver-kmk8h\" (UID: \"0452f967-d43e-4fc8-8591-3f8887d642ef\") " pod="openshift-dns/node-resolver-kmk8h" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-cni-netd\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4tmq\" (UniqueName: \"kubernetes.io/projected/42adf8fc-4871-4bee-8ebf-d7519c60b6af-kube-api-access-l4tmq\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-systemd-units\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-slash\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.259552 ip-10-0-139-49 
kubenswrapper[2575]: I0318 16:44:44.259283 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-kubelet-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.259552 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.259378 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.259465 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs podName:d6948911-017b-4b29-b362-5520b984c273 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:44.759439605 +0000 UTC m=+3.070692526 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs") pod "network-metrics-daemon-nbj8s" (UID: "d6948911-017b-4b29-b362-5520b984c273") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8f81e7e-bafc-43ea-a249-a4269a4090b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259487 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0452f967-d43e-4fc8-8591-3f8887d642ef-tmp-dir\") pod \"node-resolver-kmk8h\" (UID: \"0452f967-d43e-4fc8-8591-3f8887d642ef\") " pod="openshift-dns/node-resolver-kmk8h" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-systemd-units\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a51f93e2-c750-4700-bb46-109456c7c78a-ovnkube-config\") pod \"ovnkube-node-qktrm\" (UID: 
\"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a51f93e2-c750-4700-bb46-109456c7c78a-env-overrides\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-slash\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a51f93e2-c750-4700-bb46-109456c7c78a-ovn-node-metrics-cert\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-var-lib-kubelet\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1def291b-47ff-4e2f-b1a9-30275edc2bd9-host\") pod \"node-ca-lqcmn\" (UID: \"1def291b-47ff-4e2f-b1a9-30275edc2bd9\") " pod="openshift-image-registry/node-ca-lqcmn" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1def291b-47ff-4e2f-b1a9-30275edc2bd9-host\") pod \"node-ca-lqcmn\" (UID: \"1def291b-47ff-4e2f-b1a9-30275edc2bd9\") " pod="openshift-image-registry/node-ca-lqcmn" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-var-lib-kubelet\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-sys-fs\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-run-openvswitch\") pod \"ovnkube-node-qktrm\" (UID: 
\"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-hostroot\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259776 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-sys-fs\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.260317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-etc-kubernetes\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-run-openvswitch\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-hostroot\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9pp\" (UniqueName: \"kubernetes.io/projected/0452f967-d43e-4fc8-8591-3f8887d642ef-kube-api-access-zh9pp\") pod \"node-resolver-kmk8h\" (UID: \"0452f967-d43e-4fc8-8591-3f8887d642ef\") " pod="openshift-dns/node-resolver-kmk8h" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-etc-kubernetes\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-sysconfig\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-systemd\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-sysconfig\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91-agent-certs\") pod \"konnectivity-agent-88mp7\" (UID: \"cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91\") " pod="kube-system/konnectivity-agent-88mp7" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259907 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a51f93e2-c750-4700-bb46-109456c7c78a-env-overrides\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg\") pod \"network-check-target-hr9bv\" (UID: \"35d62bb3-aced-4698-ba43-4c7f828a050d\") " pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.259931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-systemd\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/af60e70e-528b-44c4-a08d-8bd69ee9547f-iptables-alerter-script\") pod \"iptables-alerter-kcvj8\" (UID: \"af60e70e-528b-44c4-a08d-8bd69ee9547f\") " pod="openshift-network-operator/iptables-alerter-kcvj8" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260085 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8f81e7e-bafc-43ea-a249-a4269a4090b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6h7w\" (UniqueName: \"kubernetes.io/projected/f8f81e7e-bafc-43ea-a249-a4269a4090b1-kube-api-access-b6h7w\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-run-netns\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-run-systemd\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.261302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-run-netns\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-daemon-config\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-os-release\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-os-release\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1def291b-47ff-4e2f-b1a9-30275edc2bd9-serviceca\") pod \"node-ca-lqcmn\" (UID: \"1def291b-47ff-4e2f-b1a9-30275edc2bd9\") " pod="openshift-image-registry/node-ca-lqcmn" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-socket-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-modprobe-d\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260470 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-sys\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260507 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7gfs\" (UniqueName: \"kubernetes.io/projected/a51f93e2-c750-4700-bb46-109456c7c78a-kube-api-access-d7gfs\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-cni-dir\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-run-k8s-cni-cncf-io\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8f81e7e-bafc-43ea-a249-a4269a4090b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-conf-dir\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/af60e70e-528b-44c4-a08d-8bd69ee9547f-iptables-alerter-script\") pod \"iptables-alerter-kcvj8\" (UID: \"af60e70e-528b-44c4-a08d-8bd69ee9547f\") " pod="openshift-network-operator/iptables-alerter-kcvj8" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkw5t\" (UniqueName: \"kubernetes.io/projected/1def291b-47ff-4e2f-b1a9-30275edc2bd9-kube-api-access-wkw5t\") pod \"node-ca-lqcmn\" (UID: \"1def291b-47ff-4e2f-b1a9-30275edc2bd9\") " pod="openshift-image-registry/node-ca-lqcmn" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f8f81e7e-bafc-43ea-a249-a4269a4090b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.262066 ip-10-0-139-49 
kubenswrapper[2575]: I0318 16:44:44.260674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-run-netns\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.262066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a51f93e2-c750-4700-bb46-109456c7c78a-ovnkube-script-lib\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91-konnectivity-ca\") pod \"konnectivity-agent-88mp7\" (UID: \"cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91\") " pod="kube-system/konnectivity-agent-88mp7" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0452f967-d43e-4fc8-8591-3f8887d642ef-hosts-file\") pod \"node-resolver-kmk8h\" (UID: \"0452f967-d43e-4fc8-8591-3f8887d642ef\") " pod="openshift-dns/node-resolver-kmk8h" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-cnibin\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-etc-selinux\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-cnibin\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-sysctl-conf\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-run-ovn\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-cnibin\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-socket-dir-parent\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-var-lib-cni-multus\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.260966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mp5p8\" (UniqueName: \"kubernetes.io/projected/d6948911-017b-4b29-b362-5520b984c273-kube-api-access-mp5p8\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-system-cni-dir\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84aaa605-3cf0-4df0-9d6c-85011b39807e-cni-binary-copy\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-var-lib-kubelet\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-run-systemd\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-log-socket\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-var-lib-openvswitch\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.263683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261116 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-host-run-netns\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22zln\" (UniqueName: \"kubernetes.io/projected/84aaa605-3cf0-4df0-9d6c-85011b39807e-kube-api-access-22zln\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af60e70e-528b-44c4-a08d-8bd69ee9547f-host-slash\") pod \"iptables-alerter-kcvj8\" (UID: \"af60e70e-528b-44c4-a08d-8bd69ee9547f\") " pod="openshift-network-operator/iptables-alerter-kcvj8" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-sysctl-d\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-kubernetes\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a51f93e2-c750-4700-bb46-109456c7c78a-ovnkube-script-lib\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-host\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e545205-6f68-4391-a421-c5d55b65e2d0-tmp\") pod \"tuned-mwhvr\" (UID: 
\"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-node-log\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-daemon-config\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-cni-dir\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-run-k8s-cni-cncf-io\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-sys\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261445 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-node-log\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-conf-dir\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261755 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1def291b-47ff-4e2f-b1a9-30275edc2bd9-serviceca\") pod \"node-ca-lqcmn\" (UID: \"1def291b-47ff-4e2f-b1a9-30275edc2bd9\") " pod="openshift-image-registry/node-ca-lqcmn" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0452f967-d43e-4fc8-8591-3f8887d642ef-hosts-file\") pod \"node-resolver-kmk8h\" (UID: \"0452f967-d43e-4fc8-8591-3f8887d642ef\") " pod="openshift-dns/node-resolver-kmk8h" Mar 18 16:44:44.264538 
ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-modprobe-d\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.264538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-system-cni-dir\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-os-release\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91-konnectivity-ca\") pod \"konnectivity-agent-88mp7\" (UID: \"cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91\") " pod="kube-system/konnectivity-agent-88mp7" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.261980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-etc-selinux\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af60e70e-528b-44c4-a08d-8bd69ee9547f-host-slash\") pod \"iptables-alerter-kcvj8\" (UID: \"af60e70e-528b-44c4-a08d-8bd69ee9547f\") " pod="openshift-network-operator/iptables-alerter-kcvj8" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262068 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-sysctl-conf\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-var-lib-kubelet\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-run-ovn\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 
16:44:44.262132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-log-socket\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-cnibin\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a51f93e2-c750-4700-bb46-109456c7c78a-var-lib-openvswitch\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-multus-socket-dir-parent\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/84aaa605-3cf0-4df0-9d6c-85011b39807e-host-var-lib-cni-multus\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84aaa605-3cf0-4df0-9d6c-85011b39807e-cni-binary-copy\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-sysctl-d\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262443 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-host\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8f81e7e-bafc-43ea-a249-a4269a4090b1-os-release\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262607 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-kubernetes\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.265325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42adf8fc-4871-4bee-8ebf-d7519c60b6af-socket-dir\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.266178 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.262815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f8f81e7e-bafc-43ea-a249-a4269a4090b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.266178 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.263042 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2e545205-6f68-4391-a421-c5d55b65e2d0-etc-tuned\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.266178 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.263274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91-agent-certs\") pod \"konnectivity-agent-88mp7\" (UID: \"cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91\") " pod="kube-system/konnectivity-agent-88mp7" Mar 18 16:44:44.266178 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.263722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a51f93e2-c750-4700-bb46-109456c7c78a-ovn-node-metrics-cert\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.266178 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.264908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e545205-6f68-4391-a421-c5d55b65e2d0-tmp\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.269861 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.269843 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:44.269861 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.269861 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:44.270010 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.269870 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rpztg for pod openshift-network-diagnostics/network-check-target-hr9bv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:44.270010 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.269929 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg podName:35d62bb3-aced-4698-ba43-4c7f828a050d nodeName:}" failed. No retries permitted until 2026-03-18 16:44:44.769915106 +0000 UTC m=+3.081168009 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rpztg" (UniqueName: "kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg") pod "network-check-target-hr9bv" (UID: "35d62bb3-aced-4698-ba43-4c7f828a050d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:44.270312 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.270292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4tmq\" (UniqueName: \"kubernetes.io/projected/42adf8fc-4871-4bee-8ebf-d7519c60b6af-kube-api-access-l4tmq\") pod \"aws-ebs-csi-driver-node-t4bgd\" (UID: \"42adf8fc-4871-4bee-8ebf-d7519c60b6af\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.272299 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.272281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wptxk\" (UniqueName: \"kubernetes.io/projected/af60e70e-528b-44c4-a08d-8bd69ee9547f-kube-api-access-wptxk\") pod \"iptables-alerter-kcvj8\" (UID: \"af60e70e-528b-44c4-a08d-8bd69ee9547f\") " pod="openshift-network-operator/iptables-alerter-kcvj8" Mar 18 16:44:44.272670 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.272647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh9pp\" (UniqueName: \"kubernetes.io/projected/0452f967-d43e-4fc8-8591-3f8887d642ef-kube-api-access-zh9pp\") pod \"node-resolver-kmk8h\" (UID: \"0452f967-d43e-4fc8-8591-3f8887d642ef\") " pod="openshift-dns/node-resolver-kmk8h" Mar 18 16:44:44.273620 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.273586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5wpp\" (UniqueName: \"kubernetes.io/projected/2e545205-6f68-4391-a421-c5d55b65e2d0-kube-api-access-c5wpp\") pod \"tuned-mwhvr\" (UID: \"2e545205-6f68-4391-a421-c5d55b65e2d0\") " pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.275114 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.275090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6h7w\" (UniqueName: \"kubernetes.io/projected/f8f81e7e-bafc-43ea-a249-a4269a4090b1-kube-api-access-b6h7w\") pod \"multus-additional-cni-plugins-w2ddv\" (UID: \"f8f81e7e-bafc-43ea-a249-a4269a4090b1\") " pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.275347 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.275324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp5p8\" (UniqueName: \"kubernetes.io/projected/d6948911-017b-4b29-b362-5520b984c273-kube-api-access-mp5p8\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:44.275635 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.275617 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wkw5t\" (UniqueName: \"kubernetes.io/projected/1def291b-47ff-4e2f-b1a9-30275edc2bd9-kube-api-access-wkw5t\") pod \"node-ca-lqcmn\" (UID: \"1def291b-47ff-4e2f-b1a9-30275edc2bd9\") " pod="openshift-image-registry/node-ca-lqcmn" Mar 18 16:44:44.276026 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.276008 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22zln\" (UniqueName: \"kubernetes.io/projected/84aaa605-3cf0-4df0-9d6c-85011b39807e-kube-api-access-22zln\") pod \"multus-qq8q7\" (UID: \"84aaa605-3cf0-4df0-9d6c-85011b39807e\") " pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.276447 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.276431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7gfs\" (UniqueName: \"kubernetes.io/projected/a51f93e2-c750-4700-bb46-109456c7c78a-kube-api-access-d7gfs\") pod \"ovnkube-node-qktrm\" (UID: \"a51f93e2-c750-4700-bb46-109456c7c78a\") " pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.443791 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.443722 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" Mar 18 16:44:44.452589 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.452569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kcvj8" Mar 18 16:44:44.460176 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.460155 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" Mar 18 16:44:44.466681 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.466655 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" Mar 18 16:44:44.474162 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.474146 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lqcmn" Mar 18 16:44:44.480772 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.480752 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" Mar 18 16:44:44.486294 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.486274 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-88mp7" Mar 18 16:44:44.492887 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.492868 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qq8q7" Mar 18 16:44:44.497424 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.497407 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kmk8h" Mar 18 16:44:44.764451 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.764342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:44.764572 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.764485 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:44.764572 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.764551 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs podName:d6948911-017b-4b29-b362-5520b984c273 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:45.764536918 +0000 UTC m=+4.075789817 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs") pod "network-metrics-daemon-nbj8s" (UID: "d6948911-017b-4b29-b362-5520b984c273") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:44.804268 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:44.804245 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda51f93e2_c750_4700_bb46_109456c7c78a.slice/crio-f593be8036367e503351a57f1da3bcf2390dd7689a2bf566a6a3ed38f989ad9a WatchSource:0}: Error finding container f593be8036367e503351a57f1da3bcf2390dd7689a2bf566a6a3ed38f989ad9a: Status 404 returned error can't find the container with id f593be8036367e503351a57f1da3bcf2390dd7689a2bf566a6a3ed38f989ad9a Mar 18 16:44:44.805217 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:44.805195 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1def291b_47ff_4e2f_b1a9_30275edc2bd9.slice/crio-47e1cbadb862e59bbe74f679dbdc33e9ebabf8c282ba0207dd1094828012490f WatchSource:0}: Error finding container 47e1cbadb862e59bbe74f679dbdc33e9ebabf8c282ba0207dd1094828012490f: Status 404 returned error can't find the container with id 47e1cbadb862e59bbe74f679dbdc33e9ebabf8c282ba0207dd1094828012490f Mar 18 16:44:44.808173 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:44.806606 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac8f65a_dd3e_4b2c_8a45_1ba1e842dd91.slice/crio-b95221fb92bd08c963550bd114a47339b8d530bf2199c561b58ca262a7556494 WatchSource:0}: Error finding container b95221fb92bd08c963550bd114a47339b8d530bf2199c561b58ca262a7556494: Status 404 returned error can't find the container with id b95221fb92bd08c963550bd114a47339b8d530bf2199c561b58ca262a7556494 Mar 18 16:44:44.808173 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:44.807257 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e545205_6f68_4391_a421_c5d55b65e2d0.slice/crio-b17db052f7ff7db7b1c4a8861725d09b373b3a9f205c3099268a3aa91520b363 WatchSource:0}: Error finding container b17db052f7ff7db7b1c4a8861725d09b373b3a9f205c3099268a3aa91520b363: Status 404 returned error can't find the container with id 
b17db052f7ff7db7b1c4a8861725d09b373b3a9f205c3099268a3aa91520b363 Mar 18 16:44:44.808497 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:44.808298 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f81e7e_bafc_43ea_a249_a4269a4090b1.slice/crio-fc5a24c3dd173ce32cfa18cc72eb525d977020f86362edc234e0b368c3263907 WatchSource:0}: Error finding container fc5a24c3dd173ce32cfa18cc72eb525d977020f86362edc234e0b368c3263907: Status 404 returned error can't find the container with id fc5a24c3dd173ce32cfa18cc72eb525d977020f86362edc234e0b368c3263907 Mar 18 16:44:44.813417 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:44.812539 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42adf8fc_4871_4bee_8ebf_d7519c60b6af.slice/crio-add798c9a82fbb6fd7edf9a84b452dac37306c640485af8b91595c903887a7f4 WatchSource:0}: Error finding container add798c9a82fbb6fd7edf9a84b452dac37306c640485af8b91595c903887a7f4: Status 404 returned error can't find the container with id add798c9a82fbb6fd7edf9a84b452dac37306c640485af8b91595c903887a7f4 Mar 18 16:44:44.814566 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:44.814477 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0452f967_d43e_4fc8_8591_3f8887d642ef.slice/crio-3506f326223cee6a5624046af95de02282bc5b9357f9d606d98122b9d35f9332 WatchSource:0}: Error finding container 3506f326223cee6a5624046af95de02282bc5b9357f9d606d98122b9d35f9332: Status 404 returned error can't find the container with id 3506f326223cee6a5624046af95de02282bc5b9357f9d606d98122b9d35f9332 Mar 18 16:44:44.814735 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:44:44.814703 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84aaa605_3cf0_4df0_9d6c_85011b39807e.slice/crio-08b9bb763914f1da12b305d2b6002786fa5cd767f63318c9862868b9a62369bb WatchSource:0}: Error finding container 08b9bb763914f1da12b305d2b6002786fa5cd767f63318c9862868b9a62369bb: Status 404 returned error can't find the container with id 08b9bb763914f1da12b305d2b6002786fa5cd767f63318c9862868b9a62369bb Mar 18 16:44:44.864942 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:44.864829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg\") pod \"network-check-target-hr9bv\" (UID: \"35d62bb3-aced-4698-ba43-4c7f828a050d\") " pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:44.865035 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.864969 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:44.865035 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.864986 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:44.865035 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.864996 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rpztg for pod openshift-network-diagnostics/network-check-target-hr9bv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:44.865137 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:44.865038 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg podName:35d62bb3-aced-4698-ba43-4c7f828a050d nodeName:}" failed. No retries permitted until 2026-03-18 16:44:45.865024555 +0000 UTC m=+4.176277467 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rpztg" (UniqueName: "kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg") pod "network-check-target-hr9bv" (UID: "35d62bb3-aced-4698-ba43-4c7f828a050d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:45.189058 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.189021 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:43 +0000 UTC" deadline="2027-12-03 05:35:03.521236252 +0000 UTC" Mar 18 16:44:45.189058 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.189052 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14988h50m18.332187099s" Mar 18 16:44:45.298255 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.298208 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" event={"ID":"ea09336bc3d6205db966e5eb690fe85d","Type":"ContainerStarted","Data":"477b079073aebb487d8e110ef574947544eef36726e82bcb8df4c98c085553f2"} Mar 18 16:44:45.306025 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.305966 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kcvj8" event={"ID":"af60e70e-528b-44c4-a08d-8bd69ee9547f","Type":"ContainerStarted","Data":"7282defdaf55127b411f59963d3eb528835e7fed172d321c95a5e63f8293e939"} Mar 18 16:44:45.309155 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.309089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kmk8h" event={"ID":"0452f967-d43e-4fc8-8591-3f8887d642ef","Type":"ContainerStarted","Data":"3506f326223cee6a5624046af95de02282bc5b9357f9d606d98122b9d35f9332"} Mar 18 16:44:45.313296 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.313245 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" podStartSLOduration=2.313225834 podStartE2EDuration="2.313225834s" podCreationTimestamp="2026-03-18 16:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:45.312605127 +0000 UTC m=+3.623858049" watchObservedRunningTime="2026-03-18 16:44:45.313225834 +0000 UTC m=+3.624478754" Mar 18 16:44:45.314164 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.314140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qq8q7" event={"ID":"84aaa605-3cf0-4df0-9d6c-85011b39807e","Type":"ContainerStarted","Data":"08b9bb763914f1da12b305d2b6002786fa5cd767f63318c9862868b9a62369bb"} Mar 18 16:44:45.321803 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.321739 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" 
event={"ID":"42adf8fc-4871-4bee-8ebf-d7519c60b6af","Type":"ContainerStarted","Data":"add798c9a82fbb6fd7edf9a84b452dac37306c640485af8b91595c903887a7f4"} Mar 18 16:44:45.325992 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.325971 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" event={"ID":"f8f81e7e-bafc-43ea-a249-a4269a4090b1","Type":"ContainerStarted","Data":"fc5a24c3dd173ce32cfa18cc72eb525d977020f86362edc234e0b368c3263907"} Mar 18 16:44:45.329387 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.329342 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" event={"ID":"2e545205-6f68-4391-a421-c5d55b65e2d0","Type":"ContainerStarted","Data":"b17db052f7ff7db7b1c4a8861725d09b373b3a9f205c3099268a3aa91520b363"} Mar 18 16:44:45.332051 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.332025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-88mp7" event={"ID":"cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91","Type":"ContainerStarted","Data":"b95221fb92bd08c963550bd114a47339b8d530bf2199c561b58ca262a7556494"} Mar 18 16:44:45.339483 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.339446 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lqcmn" event={"ID":"1def291b-47ff-4e2f-b1a9-30275edc2bd9","Type":"ContainerStarted","Data":"47e1cbadb862e59bbe74f679dbdc33e9ebabf8c282ba0207dd1094828012490f"} Mar 18 16:44:45.345144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.345108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" event={"ID":"a51f93e2-c750-4700-bb46-109456c7c78a","Type":"ContainerStarted","Data":"f593be8036367e503351a57f1da3bcf2390dd7689a2bf566a6a3ed38f989ad9a"} Mar 18 16:44:45.771582 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.771545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:45.771746 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:45.771726 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:45.771836 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:45.771793 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs podName:d6948911-017b-4b29-b362-5520b984c273 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:47.771775343 +0000 UTC m=+6.083028245 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs") pod "network-metrics-daemon-nbj8s" (UID: "d6948911-017b-4b29-b362-5520b984c273") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:45.872147 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:45.872065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg\") pod \"network-check-target-hr9bv\" (UID: \"35d62bb3-aced-4698-ba43-4c7f828a050d\") " pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:45.872306 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:45.872237 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:45.872306 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:45.872257 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:45.872306 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:45.872270 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rpztg for pod openshift-network-diagnostics/network-check-target-hr9bv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:45.872503 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:45.872350 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg podName:35d62bb3-aced-4698-ba43-4c7f828a050d nodeName:}" failed. No retries permitted until 2026-03-18 16:44:47.872331517 +0000 UTC m=+6.183584421 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rpztg" (UniqueName: "kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg") pod "network-check-target-hr9bv" (UID: "35d62bb3-aced-4698-ba43-4c7f828a050d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:46.286580 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:46.286536 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:46.287017 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:46.286674 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273" Mar 18 16:44:46.287117 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:46.287099 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:46.287210 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:46.287189 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d" Mar 18 16:44:46.359447 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:46.359406 2575 generic.go:358] "Generic (PLEG): container finished" podID="bbdfc43637cf5d326752644715aecd84" containerID="74ca85e89dd43bc9aff1f5431fb876ce1cdaa2d410cb79630ec12eb09cdd90da" exitCode=0 Mar 18 16:44:46.359608 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:46.359561 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" event={"ID":"bbdfc43637cf5d326752644715aecd84","Type":"ContainerDied","Data":"74ca85e89dd43bc9aff1f5431fb876ce1cdaa2d410cb79630ec12eb09cdd90da"} Mar 18 16:44:47.364283 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:47.364248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" event={"ID":"bbdfc43637cf5d326752644715aecd84","Type":"ContainerStarted","Data":"f9ac5b1ecb9e9a73f78edbbcc77d92a1336ab22bbe165d621d5f112f37685d28"} Mar 18 16:44:47.790058 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:47.789250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:47.790058 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:47.789478 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:47.790058 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:47.789603 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs podName:d6948911-017b-4b29-b362-5520b984c273 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:51.789582602 +0000 UTC m=+10.100835543 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs") pod "network-metrics-daemon-nbj8s" (UID: "d6948911-017b-4b29-b362-5520b984c273") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:47.890303 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:47.889700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg\") pod \"network-check-target-hr9bv\" (UID: \"35d62bb3-aced-4698-ba43-4c7f828a050d\") " pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:47.890303 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:47.889877 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:47.890303 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:47.889897 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:47.890303 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:47.889906 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rpztg for pod openshift-network-diagnostics/network-check-target-hr9bv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:47.890303 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:47.889951 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg podName:35d62bb3-aced-4698-ba43-4c7f828a050d nodeName:}" failed. No retries permitted until 2026-03-18 16:44:51.889936649 +0000 UTC m=+10.201189561 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rpztg" (UniqueName: "kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg") pod "network-check-target-hr9bv" (UID: "35d62bb3-aced-4698-ba43-4c7f828a050d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:48.294254 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:48.294220 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:48.294436 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:48.294399 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d" Mar 18 16:44:48.295249 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:48.295229 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:48.295459 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:48.295436 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273" Mar 18 16:44:50.285920 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:50.285887 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:50.286379 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:50.285935 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:50.286379 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:50.286030 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d" Mar 18 16:44:50.286379 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:50.286137 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273" Mar 18 16:44:51.823117 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:51.823082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:51.823723 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:51.823255 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:51.823723 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:51.823328 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs podName:d6948911-017b-4b29-b362-5520b984c273 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:59.823307151 +0000 UTC m=+18.134560066 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs") pod "network-metrics-daemon-nbj8s" (UID: "d6948911-017b-4b29-b362-5520b984c273") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:51.924689 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:51.924114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg\") pod \"network-check-target-hr9bv\" (UID: \"35d62bb3-aced-4698-ba43-4c7f828a050d\") " pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:51.924689 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:51.924301 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:51.924689 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:51.924320 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:51.924689 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:51.924332 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rpztg for pod openshift-network-diagnostics/network-check-target-hr9bv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:51.924689 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:51.924411 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg podName:35d62bb3-aced-4698-ba43-4c7f828a050d nodeName:}" failed. No retries permitted until 2026-03-18 16:44:59.924390025 +0000 UTC m=+18.235642947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rpztg" (UniqueName: "kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg") pod "network-check-target-hr9bv" (UID: "35d62bb3-aced-4698-ba43-4c7f828a050d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:52.286445 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:52.286413 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:52.286632 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:52.286517 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d" Mar 18 16:44:52.286632 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:52.286588 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:52.286748 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:52.286683 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273" Mar 18 16:44:54.285934 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:54.285855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:54.286348 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:54.285982 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d" Mar 18 16:44:54.286348 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:54.286024 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:54.286348 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:54.286115 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273" Mar 18 16:44:56.286217 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:56.286183 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:56.286774 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:56.286308 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d" Mar 18 16:44:56.286774 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:56.286414 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:56.286774 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:56.286506 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273" Mar 18 16:44:58.285877 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:58.285835 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:58.286327 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:58.285842 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:58.286327 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:58.285959 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d" Mar 18 16:44:58.286327 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:58.286024 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273" Mar 18 16:44:59.879030 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:59.878988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:44:59.879512 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:59.879151 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:59.879512 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:59.879245 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs podName:d6948911-017b-4b29-b362-5520b984c273 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:15.879222968 +0000 UTC m=+34.190475884 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs") pod "network-metrics-daemon-nbj8s" (UID: "d6948911-017b-4b29-b362-5520b984c273") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:59.979966 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:44:59.979928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg\") pod \"network-check-target-hr9bv\" (UID: \"35d62bb3-aced-4698-ba43-4c7f828a050d\") " pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:44:59.980143 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:59.980126 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:59.980200 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:59.980153 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:59.980200 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:59.980168 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rpztg for pod openshift-network-diagnostics/network-check-target-hr9bv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:59.980279 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:44:59.980232 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg podName:35d62bb3-aced-4698-ba43-4c7f828a050d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:15.980213173 +0000 UTC m=+34.291466087 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rpztg" (UniqueName: "kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg") pod "network-check-target-hr9bv" (UID: "35d62bb3-aced-4698-ba43-4c7f828a050d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:45:00.285702 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:00.285665 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:45:00.285859 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:00.285801 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273" Mar 18 16:45:00.285941 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:00.285869 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:45:00.285987 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:00.285970 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d" Mar 18 16:45:02.286953 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.286311 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv" Mar 18 16:45:02.286953 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:02.286628 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d" Mar 18 16:45:02.286953 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.286760 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:45:02.286953 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:02.286840 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273" Mar 18 16:45:02.391319 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.391283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kmk8h" event={"ID":"0452f967-d43e-4fc8-8591-3f8887d642ef","Type":"ContainerStarted","Data":"a51eec87ca9b24bb57e4e6f57c863170da6f4ef4097485eb9327be075380450d"} Mar 18 16:45:02.392635 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.392535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qq8q7" event={"ID":"84aaa605-3cf0-4df0-9d6c-85011b39807e","Type":"ContainerStarted","Data":"e536cc3c2c519cc8a3c75baf05c5a2a8272f364663c17030435b23b02fd21e90"} Mar 18 16:45:02.394633 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.394595 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" event={"ID":"42adf8fc-4871-4bee-8ebf-d7519c60b6af","Type":"ContainerStarted","Data":"9e0f3f16e83b4ce974113707028290c71cfeae241fe0b1dee306bdfbf46da33a"} Mar 18 16:45:02.395848 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.395827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" event={"ID":"f8f81e7e-bafc-43ea-a249-a4269a4090b1","Type":"ContainerStarted","Data":"23534ebf0823dfaad2a988834cb48944ec4cc7ea1180612debcad32888aa34b2"} Mar 18 16:45:02.397170 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.397148 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" event={"ID":"2e545205-6f68-4391-a421-c5d55b65e2d0","Type":"ContainerStarted","Data":"1f2e128428356c63c05328367ec4279ed4396204e4d8a1ebeac91420d1c5ac42"} Mar 18 16:45:02.398177 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.398157 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-88mp7" event={"ID":"cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91","Type":"ContainerStarted","Data":"97b54e28c6b4e4cee22c76dc854f19173082b5b39d893d44ede3d0a12ea12c0c"} Mar 18 16:45:02.402886 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.402848 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lqcmn" event={"ID":"1def291b-47ff-4e2f-b1a9-30275edc2bd9","Type":"ContainerStarted","Data":"a8e1e59d08e1004bde5aeea19c828232d2b8286f5dec909ad69e8d35d25bd83b"} Mar 18 16:45:02.403881 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.403862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" event={"ID":"a51f93e2-c750-4700-bb46-109456c7c78a","Type":"ContainerStarted","Data":"3b57ab037decd9567d2975f0cc86eb03f87b4e1d3cf9716d81f9aa423c2054e0"} Mar 18 16:45:02.406887 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.406858 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" podStartSLOduration=19.40683365 podStartE2EDuration="19.40683365s" podCreationTimestamp="2026-03-18 16:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:47.379212768 +0000 UTC m=+5.690465685" watchObservedRunningTime="2026-03-18 16:45:02.40683365 +0000 UTC m=+20.718086550" Mar 18 16:45:02.407135 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.407117 2575 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-dns/node-resolver-kmk8h" podStartSLOduration=3.168845773 podStartE2EDuration="20.407111374s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.81770275 +0000 UTC m=+3.128955649" lastFinishedPulling="2026-03-18 16:45:02.055968335 +0000 UTC m=+20.367221250" observedRunningTime="2026-03-18 16:45:02.406891257 +0000 UTC m=+20.718144179" watchObservedRunningTime="2026-03-18 16:45:02.407111374 +0000 UTC m=+20.718364294" Mar 18 16:45:02.427804 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.427771 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-88mp7" podStartSLOduration=11.321830972 podStartE2EDuration="20.42775921s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.811177199 +0000 UTC m=+3.122430098" lastFinishedPulling="2026-03-18 16:44:53.917105424 +0000 UTC m=+12.228358336" observedRunningTime="2026-03-18 16:45:02.422078522 +0000 UTC m=+20.733331444" watchObservedRunningTime="2026-03-18 16:45:02.42775921 +0000 UTC m=+20.739012131" Mar 18 16:45:02.438120 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.438077 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mwhvr" podStartSLOduration=3.173168865 podStartE2EDuration="20.438066509s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.810862644 +0000 UTC m=+3.122115542" lastFinishedPulling="2026-03-18 16:45:02.075760274 +0000 UTC m=+20.387013186" observedRunningTime="2026-03-18 16:45:02.438052303 +0000 UTC m=+20.749305224" watchObservedRunningTime="2026-03-18 16:45:02.438066509 +0000 UTC m=+20.749319426" Mar 18 16:45:02.454022 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.453986 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qq8q7" podStartSLOduration=3.179848489 podStartE2EDuration="20.453974168s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.821584238 +0000 UTC m=+3.132837151" lastFinishedPulling="2026-03-18 16:45:02.095709915 +0000 UTC m=+20.406962830" observedRunningTime="2026-03-18 16:45:02.453504718 +0000 UTC m=+20.764757638" watchObservedRunningTime="2026-03-18 16:45:02.453974168 +0000 UTC m=+20.765227089" Mar 18 16:45:02.485913 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:02.485872 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lqcmn" podStartSLOduration=3.51294476 podStartE2EDuration="20.485859045s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.809399233 +0000 UTC m=+3.120652133" lastFinishedPulling="2026-03-18 16:45:01.782313517 +0000 UTC m=+20.093566418" observedRunningTime="2026-03-18 16:45:02.48553879 +0000 UTC m=+20.796791711" watchObservedRunningTime="2026-03-18 16:45:02.485859045 +0000 UTC m=+20.797111965" Mar 18 16:45:03.406696 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:03.406437 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8f81e7e-bafc-43ea-a249-a4269a4090b1" containerID="23534ebf0823dfaad2a988834cb48944ec4cc7ea1180612debcad32888aa34b2" exitCode=0 Mar 18 16:45:03.407418 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:03.406488 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" 
event={"ID":"f8f81e7e-bafc-43ea-a249-a4269a4090b1","Type":"ContainerDied","Data":"23534ebf0823dfaad2a988834cb48944ec4cc7ea1180612debcad32888aa34b2"} Mar 18 16:45:03.409296 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:03.409278 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 16:45:03.409634 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:03.409613 2575 generic.go:358] "Generic (PLEG): container finished" podID="a51f93e2-c750-4700-bb46-109456c7c78a" containerID="3c1b0580625feaca33a07514d8a9347eef3714298c715e0fdf3dd05da0e0edea" exitCode=1 Mar 18 16:45:03.409720 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:03.409697 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" event={"ID":"a51f93e2-c750-4700-bb46-109456c7c78a","Type":"ContainerStarted","Data":"44b2bbf819fc605f2f0d7f973b6325cef91527124fc987d8e46aa68fea5fa7cf"} Mar 18 16:45:03.409773 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:03.409721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" event={"ID":"a51f93e2-c750-4700-bb46-109456c7c78a","Type":"ContainerStarted","Data":"a169f647480cd5da8865497062cac979af4cfb662aa37eb5985af31bb7f26a87"} Mar 18 16:45:03.409773 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:03.409732 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" event={"ID":"a51f93e2-c750-4700-bb46-109456c7c78a","Type":"ContainerStarted","Data":"45f01979f141f4af918027ebcf0a581855b2e0bf1a45cd74cc0e89eb2e4a98db"} Mar 18 16:45:03.409773 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:03.409740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" event={"ID":"a51f93e2-c750-4700-bb46-109456c7c78a","Type":"ContainerStarted","Data":"c62336cddc38e7475b38429c28f9751ca92f717f482c1215d96f1d6ed5c185e9"} Mar 18 16:45:03.409773 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:03.409751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" event={"ID":"a51f93e2-c750-4700-bb46-109456c7c78a","Type":"ContainerDied","Data":"3c1b0580625feaca33a07514d8a9347eef3714298c715e0fdf3dd05da0e0edea"} Mar 18 16:45:03.845957 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:03.845930 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 18 16:45:04.231725 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.231593 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:45:03.845951281Z","UUID":"4061659e-63da-4071-a733-24eee4e46510","Handler":null,"Name":"","Endpoint":""} Mar 18 16:45:04.234911 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.234889 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 18 16:45:04.235045 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.234918 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 18 16:45:04.285730 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.285702 2575 
Mar 18 16:45:04.285730 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.285702 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:04.285730 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.285724 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:45:04.285975 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:04.285812 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d"
Mar 18 16:45:04.286161 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:04.286138 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273"
Mar 18 16:45:04.413528 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.413487 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kcvj8" event={"ID":"af60e70e-528b-44c4-a08d-8bd69ee9547f","Type":"ContainerStarted","Data":"d8085de358e2bb0d5223fa60aa0ba139e74471b445f2483ac8e2aaf085198b79"}
Mar 18 16:45:04.415843 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.415818 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" event={"ID":"42adf8fc-4871-4bee-8ebf-d7519c60b6af","Type":"ContainerStarted","Data":"b9da3fe4b307b0ea8f6515aeb56f098059e3ff6f2f9cd9b9febbe25387dcc066"}
Mar 18 16:45:04.429693 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.429639 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kcvj8" podStartSLOduration=5.175101142 podStartE2EDuration="22.429622904s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.820446063 +0000 UTC m=+3.131698965" lastFinishedPulling="2026-03-18 16:45:02.074967825 +0000 UTC m=+20.386220727" observedRunningTime="2026-03-18 16:45:04.428699246 +0000 UTC m=+22.739952167" watchObservedRunningTime="2026-03-18 16:45:04.429622904 +0000 UTC m=+22.740875826"
Mar 18 16:45:04.437856 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.437831 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-88mp7"
Mar 18 16:45:04.438650 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:04.438628 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-88mp7"
Mar 18 16:45:05.420635 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:05.420404 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" event={"ID":"42adf8fc-4871-4bee-8ebf-d7519c60b6af","Type":"ContainerStarted","Data":"e2ad742f4545ecec63929ea9eb9f6bffcad3e5b1dff91f5ba794170732cc8b51"}
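The repeated "network is not ready" errors above are expected at this point in boot: the kubelet refuses to create pod sandboxes for non-host-network pods while the runtime reports NetworkReady=false, and that condition clears only once the network plugin (here OVN-Kubernetes fronted by Multus) writes a CNI config into /etc/kubernetes/cni/net.d/. The same condition can be read directly from the runtime over CRI; a sketch, assuming the CRI-O socket path typical for this kind of node:

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// Status returns the runtime conditions, including the NetworkReady
	// condition that produces the errors logged above while false.
	resp, err := runtimeapi.NewRuntimeServiceClient(conn).Status(ctx, &runtimeapi.StatusRequest{})
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range resp.Status.Conditions {
		fmt.Printf("%s=%v reason:%s message:%s\n", c.Type, c.Status, c.Reason, c.Message)
	}
}
```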
Mar 18 16:45:05.423722 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:05.423702 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log"
Mar 18 16:45:05.424129 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:05.424101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" event={"ID":"a51f93e2-c750-4700-bb46-109456c7c78a","Type":"ContainerStarted","Data":"295809440a9df9a84efdab867876913e4bbd9fa8f3215c19145f014e641c1d72"}
Mar 18 16:45:05.436867 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:05.436818 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-t4bgd" podStartSLOduration=3.11175394 podStartE2EDuration="23.436803558s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.81421266 +0000 UTC m=+3.125465574" lastFinishedPulling="2026-03-18 16:45:05.139262289 +0000 UTC m=+23.450515192" observedRunningTime="2026-03-18 16:45:05.436671615 +0000 UTC m=+23.747924562" watchObservedRunningTime="2026-03-18 16:45:05.436803558 +0000 UTC m=+23.748056488"
Mar 18 16:45:06.285857 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:06.285814 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:06.286056 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:06.285952 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d"
Mar 18 16:45:06.286056 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:06.286012 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:45:06.286171 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:06.286136 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273"
Mar 18 16:45:06.464882 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:06.464834 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-88mp7"
Mar 18 16:45:06.465588 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:06.464994 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 18 16:45:06.465588 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:06.465557 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-88mp7"
Mar 18 16:45:08.285849 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:08.285681 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:08.286445 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:08.285688 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:45:08.286445 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:08.285935 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d"
Mar 18 16:45:08.286445 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:08.286037 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273"
Mar 18 16:45:08.432335 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:08.432291 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8f81e7e-bafc-43ea-a249-a4269a4090b1" containerID="bd8700a5028a911579f3fe8c84b6767066f9cd52936cf2564597f1083d5713d3" exitCode=0
Mar 18 16:45:08.432487 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:08.432387 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" event={"ID":"f8f81e7e-bafc-43ea-a249-a4269a4090b1","Type":"ContainerDied","Data":"bd8700a5028a911579f3fe8c84b6767066f9cd52936cf2564597f1083d5713d3"}
Mar 18 16:45:08.435616 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:08.435600 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log"
Mar 18 16:45:08.435901 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:08.435881 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" event={"ID":"a51f93e2-c750-4700-bb46-109456c7c78a","Type":"ContainerStarted","Data":"b4dc4c546ea602dc0e23792d1726c13976b968ab3268571c4a06c89471b48064"}
Mar 18 16:45:08.436211 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:08.436192 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:45:08.436292 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:08.436222 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:45:08.436386 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:08.436349 2575 scope.go:117] "RemoveContainer" containerID="3c1b0580625feaca33a07514d8a9347eef3714298c715e0fdf3dd05da0e0edea"
Mar 18 16:45:08.452791 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:08.452764 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:45:09.441016 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:09.440994 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log"
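The sequence around 16:45:03-16:45:08 is a container crash-and-restart cycle as seen through the PLEG: ovn-acl-logging exits with code 1 (ContainerDied), the kubelet restarts it (ContainerStarted) and then garbage-collects the dead instance ("RemoveContainer"). The Generic PLEG derives these events by periodically relisting container state and diffing snapshots; a toy model of that diff idea (not the kubelet's actual code, which lives in pkg/kubelet/pleg and tracks far more state):

```go
package main

import "fmt"

type state string

const (
	running state = "running"
	exited  state = "exited"
)

type event struct{ ID, Type string }

// diff emits ContainerStarted/ContainerDied events for containers whose
// state changed between two relist snapshots, mirroring the PLEG records above.
func diff(old, cur map[string]state) []event {
	var events []event
	for id, s := range cur {
		if old[id] != s {
			switch s {
			case running:
				events = append(events, event{id, "ContainerStarted"})
			case exited:
				events = append(events, event{id, "ContainerDied"})
			}
		}
	}
	return events
}

func main() {
	prev := map[string]state{"3c1b0580": running}
	curr := map[string]state{"3c1b0580": exited, "44b2bbf8": running}
	// Prints one ContainerDied and one ContainerStarted event (map order varies).
	fmt.Println(diff(prev, curr))
}
```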
Mar 18 16:45:09.441320 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:09.441283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" event={"ID":"a51f93e2-c750-4700-bb46-109456c7c78a","Type":"ContainerStarted","Data":"1e2140c5f9bca1f6e101ea0b261e9e3f41dcc59df68f321783c0bf0efc6b9cb5"}
Mar 18 16:45:09.454092 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:09.454074 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:45:09.477514 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:09.477486 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:45:09.484755 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:09.484718 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm" podStartSLOduration=10.158803269 podStartE2EDuration="27.484706338s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.806885795 +0000 UTC m=+3.118138698" lastFinishedPulling="2026-03-18 16:45:02.132788853 +0000 UTC m=+20.444041767" observedRunningTime="2026-03-18 16:45:09.484014079 +0000 UTC m=+27.795267025" watchObservedRunningTime="2026-03-18 16:45:09.484706338 +0000 UTC m=+27.795959308"
Mar 18 16:45:09.537229 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:09.537200 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nbj8s"]
Mar 18 16:45:09.537389 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:09.537376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:45:09.537533 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:09.537506 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273"
Mar 18 16:45:09.540149 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:09.540122 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hr9bv"]
Mar 18 16:45:09.540255 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:09.540243 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:09.540342 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:09.540323 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d"
Mar 18 16:45:10.447570 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:10.447530 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8f81e7e-bafc-43ea-a249-a4269a4090b1" containerID="8db34834564afe8386b8d556b175d377998193b016d67204d946d31f73aa81d5" exitCode=0
Mar 18 16:45:10.448055 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:10.447624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" event={"ID":"f8f81e7e-bafc-43ea-a249-a4269a4090b1","Type":"ContainerDied","Data":"8db34834564afe8386b8d556b175d377998193b016d67204d946d31f73aa81d5"}
Mar 18 16:45:11.285735 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:11.285703 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:11.285735 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:11.285718 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:45:11.285989 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:11.285829 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d"
Mar 18 16:45:11.285989 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:11.285970 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273"
Mar 18 16:45:12.452799 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:12.452764 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8f81e7e-bafc-43ea-a249-a4269a4090b1" containerID="b9a9fe440cd9d00edded30c7d42d9888c27da8e684d8b0e59ad40d8a07757040" exitCode=0
Mar 18 16:45:12.453436 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:12.452806 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" event={"ID":"f8f81e7e-bafc-43ea-a249-a4269a4090b1","Type":"ContainerDied","Data":"b9a9fe440cd9d00edded30c7d42d9888c27da8e684d8b0e59ad40d8a07757040"}
Mar 18 16:45:13.285596 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:13.285557 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:45:13.285780 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:13.285557 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv"
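The multus-additional-cni-plugins pod is meanwhile working through its chain of init containers: each "Generic (PLEG): container finished ... exitCode=0" / ContainerDied pair marks one plugin installer completing before the next one starts. The same progression is visible from the API side in the pod's initContainerStatuses; a hedged client-go sketch (pod name and namespace taken from the log, in-cluster credentials assumed):

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		log.Fatal(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	pod, err := client.CoreV1().Pods("openshift-multus").Get(
		context.Background(), "multus-additional-cni-plugins-w2ddv", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	// Each completed init container should show exit 0, matching the
	// ContainerDied/exitCode=0 events in the log above.
	for _, st := range pod.Status.InitContainerStatuses {
		if t := st.State.Terminated; t != nil {
			fmt.Printf("%s: exit %d\n", st.Name, t.ExitCode)
		}
	}
}
```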
Mar 18 16:45:13.285780 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:13.285715 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273"
Mar 18 16:45:13.285904 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:13.285848 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d"
Mar 18 16:45:15.285615 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:15.285575 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:15.285615 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:15.285588 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:45:15.286221 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:15.285697 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hr9bv" podUID="35d62bb3-aced-4698-ba43-4c7f828a050d"
Mar 18 16:45:15.286221 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:15.285827 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273"
Mar 18 16:45:15.892964 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:15.892937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:45:15.893170 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:15.893078 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:45:15.893170 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:15.893150 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs podName:d6948911-017b-4b29-b362-5520b984c273 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:47.893128596 +0000 UTC m=+66.204381511 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs") pod "network-metrics-daemon-nbj8s" (UID: "d6948911-017b-4b29-b362-5520b984c273") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:45:15.994056 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:15.994022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg\") pod \"network-check-target-hr9bv\" (UID: \"35d62bb3-aced-4698-ba43-4c7f828a050d\") " pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:15.994210 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:15.994155 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:45:15.994210 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:15.994173 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:45:15.994210 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:15.994185 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rpztg for pod openshift-network-diagnostics/network-check-target-hr9bv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:45:15.994395 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:15.994248 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg podName:35d62bb3-aced-4698-ba43-4c7f828a050d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:47.994228308 +0000 UTC m=+66.305481221 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-rpztg" (UniqueName: "kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg") pod "network-check-target-hr9bv" (UID: "35d62bb3-aced-4698-ba43-4c7f828a050d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:45:16.070174 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.070150 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeReady"
Mar 18 16:45:16.070380 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.070316 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Mar 18 16:45:16.122955 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.122921 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-697ns"]
Mar 18 16:45:16.139769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.139715 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cx9cp"]
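The projected.go errors above describe a kube-api-access-* volume: a projected volume stitching together the service-account token, the kube-root-ca.crt config map, and (on OpenShift) openshift-service-ca.crt. Mounting fails with "not registered" while the kubelet's per-namespace object caches are still unpopulated. An illustrative sketch of the pieces such a volume references, built from corev1 types; the kubelet generates the real thing internally, and the exact fields here (expiration, item paths) are assumptions:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	expiry := int64(3607) // assumed token lifetime for illustration
	vol := corev1.Volume{
		Name: "kube-api-access-rpztg",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					// Bound service-account token.
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						ExpirationSeconds: &expiry, Path: "token"}},
					// Cluster root CA, per-namespace config map.
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}}}},
					// OpenShift's service-serving CA, also per namespace.
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "openshift-service-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "service-ca.crt", Path: "service-ca.crt"}}}},
				},
			},
		},
	}
	fmt.Printf("%+v\n", vol)
}
```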
Mar 18 16:45:16.139943 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.139912 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.142210 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.142185 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Mar 18 16:45:16.143230 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.142337 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Mar 18 16:45:16.143230 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.142557 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpjpl\""
Mar 18 16:45:16.160161 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.159999 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-697ns"]
Mar 18 16:45:16.160161 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.160029 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cx9cp"]
Mar 18 16:45:16.160339 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.160160 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:16.162080 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.162062 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Mar 18 16:45:16.162414 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.162395 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Mar 18 16:45:16.162504 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.162457 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Mar 18 16:45:16.162504 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.162491 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9v9vx\""
Mar 18 16:45:16.296947 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.296914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7443181f-d141-4218-bcc9-ca8fbafa0034-config-volume\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.297395 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.296981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.297395 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.297083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7443181f-d141-4218-bcc9-ca8fbafa0034-tmp-dir\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.297395 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.297119 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj7ts\" (UniqueName: \"kubernetes.io/projected/7443181f-d141-4218-bcc9-ca8fbafa0034-kube-api-access-cj7ts\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.297395 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.297136 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:16.297395 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.297189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2mxc\" (UniqueName: \"kubernetes.io/projected/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-kube-api-access-w2mxc\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:16.398317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.398226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7443181f-d141-4218-bcc9-ca8fbafa0034-tmp-dir\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.398317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.398279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cj7ts\" (UniqueName: \"kubernetes.io/projected/7443181f-d141-4218-bcc9-ca8fbafa0034-kube-api-access-cj7ts\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.398317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.398309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:16.398617 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.398336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2mxc\" (UniqueName: \"kubernetes.io/projected/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-kube-api-access-w2mxc\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:16.398617 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.398378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7443181f-d141-4218-bcc9-ca8fbafa0034-config-volume\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.398617 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.398440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.398617 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:16.398482 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:16.398617 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:16.398559 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert podName:df0141d5-7fdb-4d38-a11e-e2f21fffe1bb nodeName:}" failed. No retries permitted until 2026-03-18 16:45:16.898538677 +0000 UTC m=+35.209791589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert") pod "ingress-canary-cx9cp" (UID: "df0141d5-7fdb-4d38-a11e-e2f21fffe1bb") : secret "canary-serving-cert" not found
Mar 18 16:45:16.398617 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:16.398563 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:16.398617 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:16.398613 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls podName:7443181f-d141-4218-bcc9-ca8fbafa0034 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:16.8985977 +0000 UTC m=+35.209850598 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls") pod "dns-default-697ns" (UID: "7443181f-d141-4218-bcc9-ca8fbafa0034") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:16.398931 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.398624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7443181f-d141-4218-bcc9-ca8fbafa0034-tmp-dir\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.399054 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.399036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7443181f-d141-4218-bcc9-ca8fbafa0034-config-volume\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.409580 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.409553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj7ts\" (UniqueName: \"kubernetes.io/projected/7443181f-d141-4218-bcc9-ca8fbafa0034-kube-api-access-cj7ts\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:16.409727 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.409707 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2mxc\" (UniqueName: \"kubernetes.io/projected/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-kube-api-access-w2mxc\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:16.902563 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.902313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
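These failures are the ordinary "consumer scheduled before its secret exists" race: dns-default and ingress-canary reference serving-cert secrets that the service CA machinery has not minted yet, so secret.go reports "not found". Note the distinction from the earlier "not registered" errors, which meant the kubelet's own cache for that namespace had not synced. Whether the object actually exists can be confirmed from the API side; a small client-go sketch (in-cluster credentials with read access assumed):

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		log.Fatal(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// The secret the dns-default mount above is waiting on.
	_, err = client.CoreV1().Secrets("openshift-dns").Get(
		context.Background(), "dns-default-metrics-tls", metav1.GetOptions{})
	if err != nil {
		fmt.Println("still missing:", err) // matches the kubelet's "not found"
		return
	}
	fmt.Println("secret exists; the next mount retry should succeed")
}
```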
Mar 18 16:45:16.902848 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:16.902633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:16.902848 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:16.902483 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:16.902848 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:16.902716 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:16.902848 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:16.902720 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls podName:7443181f-d141-4218-bcc9-ca8fbafa0034 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:17.902703877 +0000 UTC m=+36.213956780 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls") pod "dns-default-697ns" (UID: "7443181f-d141-4218-bcc9-ca8fbafa0034") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:16.902848 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:16.902791 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert podName:df0141d5-7fdb-4d38-a11e-e2f21fffe1bb nodeName:}" failed. No retries permitted until 2026-03-18 16:45:17.902772858 +0000 UTC m=+36.214025764 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert") pod "ingress-canary-cx9cp" (UID: "df0141d5-7fdb-4d38-a11e-e2f21fffe1bb") : secret "canary-serving-cert" not found
Mar 18 16:45:17.286059 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:17.285971 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:45:17.286229 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:17.285971 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:17.289312 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:17.289116 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Mar 18 16:45:17.289312 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:17.289116 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wwjb5\""
Mar 18 16:45:17.289312 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:17.289114 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 18 16:45:17.289312 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:17.289120 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-97tcn\""
Mar 18 16:45:17.289312 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:17.289125 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 18 16:45:17.909650 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:17.909606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:17.910060 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:17.909692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:17.910060 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:17.909726 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:17.910060 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:17.909803 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:17.910060 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:17.909820 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert podName:df0141d5-7fdb-4d38-a11e-e2f21fffe1bb nodeName:}" failed. No retries permitted until 2026-03-18 16:45:19.909790868 +0000 UTC m=+38.221043786 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert") pod "ingress-canary-cx9cp" (UID: "df0141d5-7fdb-4d38-a11e-e2f21fffe1bb") : secret "canary-serving-cert" not found
Mar 18 16:45:17.910060 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:17.909838 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls podName:7443181f-d141-4218-bcc9-ca8fbafa0034 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:19.909827778 +0000 UTC m=+38.221080678 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls") pod "dns-default-697ns" (UID: "7443181f-d141-4218-bcc9-ca8fbafa0034") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:19.469320 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:19.469293 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8f81e7e-bafc-43ea-a249-a4269a4090b1" containerID="48ebd81e49a5125d76099d14311868774cb83db99c395488bb88fdffe08b286b" exitCode=0
Mar 18 16:45:19.469689 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:19.469353 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" event={"ID":"f8f81e7e-bafc-43ea-a249-a4269a4090b1","Type":"ContainerDied","Data":"48ebd81e49a5125d76099d14311868774cb83db99c395488bb88fdffe08b286b"}
Mar 18 16:45:19.925183 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:19.925129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:19.925416 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:19.925203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:19.925416 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:19.925283 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:19.925416 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:19.925283 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:19.925416 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:19.925351 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls podName:7443181f-d141-4218-bcc9-ca8fbafa0034 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:23.925332111 +0000 UTC m=+42.236585025 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls") pod "dns-default-697ns" (UID: "7443181f-d141-4218-bcc9-ca8fbafa0034") : secret "dns-default-metrics-tls" not found
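The durationBeforeRetry values across these records (500ms, 1s, 2s, 4s here, then 8s, 16s, 32s, and 1m4s below) show the per-volume exponential backoff that nestedpendingoperations applies: each failed MountVolume.SetUp doubles the wait before the next attempt, up to a cap. A minimal sketch of the doubling; the cap value is an assumption for illustration, since the log only shows the sequence reaching 1m4s:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const maxDelay = 2 * time.Minute // assumed cap; not confirmed by the log
	d := 500 * time.Millisecond
	for i := 0; i < 8; i++ {
		fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s, as observed above
		d *= 2
		if d > maxDelay {
			d = maxDelay
		}
	}
}
```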
Mar 18 16:45:19.925416 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:19.925393 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert podName:df0141d5-7fdb-4d38-a11e-e2f21fffe1bb nodeName:}" failed. No retries permitted until 2026-03-18 16:45:23.92538566 +0000 UTC m=+42.236638560 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert") pod "ingress-canary-cx9cp" (UID: "df0141d5-7fdb-4d38-a11e-e2f21fffe1bb") : secret "canary-serving-cert" not found
Mar 18 16:45:20.473744 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:20.473715 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8f81e7e-bafc-43ea-a249-a4269a4090b1" containerID="6c74be721b1331f75459967cc49ebfbc575b7b742309fcc50c944bbb1f50ed36" exitCode=0
Mar 18 16:45:20.474097 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:20.473771 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" event={"ID":"f8f81e7e-bafc-43ea-a249-a4269a4090b1","Type":"ContainerDied","Data":"6c74be721b1331f75459967cc49ebfbc575b7b742309fcc50c944bbb1f50ed36"}
Mar 18 16:45:21.478790 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:21.478755 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" event={"ID":"f8f81e7e-bafc-43ea-a249-a4269a4090b1","Type":"ContainerStarted","Data":"650f1d968356c398db1ff31d6474c42bf6740c173aed2972141a84cb6899a705"}
Mar 18 16:45:21.500159 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:21.500114 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-w2ddv" podStartSLOduration=5.716617042 podStartE2EDuration="39.500101614s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.81149264 +0000 UTC m=+3.122745552" lastFinishedPulling="2026-03-18 16:45:18.59497721 +0000 UTC m=+36.906230124" observedRunningTime="2026-03-18 16:45:21.498839665 +0000 UTC m=+39.810092614" watchObservedRunningTime="2026-03-18 16:45:21.500101614 +0000 UTC m=+39.811354536"
Mar 18 16:45:23.954196 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:23.954156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:23.954702 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:23.954209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:23.954702 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:23.954296 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:23.954702 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:23.954311 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:23.954702 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:23.954348 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls podName:7443181f-d141-4218-bcc9-ca8fbafa0034 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:31.954333877 +0000 UTC m=+50.265586776 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls") pod "dns-default-697ns" (UID: "7443181f-d141-4218-bcc9-ca8fbafa0034") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:23.954702 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:23.954398 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert podName:df0141d5-7fdb-4d38-a11e-e2f21fffe1bb nodeName:}" failed. No retries permitted until 2026-03-18 16:45:31.95438017 +0000 UTC m=+50.265633086 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert") pod "ingress-canary-cx9cp" (UID: "df0141d5-7fdb-4d38-a11e-e2f21fffe1bb") : secret "canary-serving-cert" not found
Mar 18 16:45:32.007991 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:32.007948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:32.008337 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:32.008007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:32.008337 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:32.008106 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:32.008337 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:32.008108 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:32.008337 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:32.008164 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls podName:7443181f-d141-4218-bcc9-ca8fbafa0034 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:48.008148867 +0000 UTC m=+66.319401765 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls") pod "dns-default-697ns" (UID: "7443181f-d141-4218-bcc9-ca8fbafa0034") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:32.008337 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:32.008178 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert podName:df0141d5-7fdb-4d38-a11e-e2f21fffe1bb nodeName:}" failed. No retries permitted until 2026-03-18 16:45:48.008171735 +0000 UTC m=+66.319424634 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert") pod "ingress-canary-cx9cp" (UID: "df0141d5-7fdb-4d38-a11e-e2f21fffe1bb") : secret "canary-serving-cert" not found
Mar 18 16:45:41.467067 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:41.467040 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qktrm"
Mar 18 16:45:47.917685 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:47.917633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:45:47.920394 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:47.920377 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 18 16:45:47.928254 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:47.928240 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:45:47.928299 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:47.928291 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs podName:d6948911-017b-4b29-b362-5520b984c273 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:51.9282774 +0000 UTC m=+130.239530302 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs") pod "network-metrics-daemon-nbj8s" (UID: "d6948911-017b-4b29-b362-5520b984c273") : secret "metrics-daemon-secret" not found
Mar 18 16:45:48.018256 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:48.018224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:45:48.018396 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:48.018264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg\") pod \"network-check-target-hr9bv\" (UID: \"35d62bb3-aced-4698-ba43-4c7f828a050d\") " pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:48.018396 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:48.018298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:45:48.018471 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:48.018418 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:48.018471 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:48.018440 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:48.018530 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:48.018477 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert podName:df0141d5-7fdb-4d38-a11e-e2f21fffe1bb nodeName:}" failed. No retries permitted until 2026-03-18 16:46:20.018456596 +0000 UTC m=+98.329709496 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert") pod "ingress-canary-cx9cp" (UID: "df0141d5-7fdb-4d38-a11e-e2f21fffe1bb") : secret "canary-serving-cert" not found
Mar 18 16:45:48.018530 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:45:48.018498 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls podName:7443181f-d141-4218-bcc9-ca8fbafa0034 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:20.018483374 +0000 UTC m=+98.329736273 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls") pod "dns-default-697ns" (UID: "7443181f-d141-4218-bcc9-ca8fbafa0034") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:48.020754 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:48.020740 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 18 16:45:48.030598 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:48.030580 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Mar 18 16:45:48.042485 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:48.042460 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpztg\" (UniqueName: \"kubernetes.io/projected/35d62bb3-aced-4698-ba43-4c7f828a050d-kube-api-access-rpztg\") pod \"network-check-target-hr9bv\" (UID: \"35d62bb3-aced-4698-ba43-4c7f828a050d\") " pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:48.205861 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:48.205796 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wwjb5\""
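The "Caches populated" records mark kubelet reflectors finishing their initial list/watch for specific namespaced objects, and immediately afterwards the kube-api-access-rpztg mount that had been failing with "not registered" succeeds. The same primitive is available directly in client-go: start an informer and block until its cache syncs. A hedged sketch of that step:

```go
package main

import (
	"fmt"
	"log"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
	"k8s.io/client-go/tools/cache"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		log.Fatal(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// List/watch Secrets in one namespace, mirroring the per-namespace
	// reflectors whose "Caches populated" records appear above.
	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 30*time.Second, informers.WithNamespace("openshift-multus"))
	secrets := factory.Core().V1().Secrets().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)

	if !cache.WaitForCacheSync(stop, secrets.HasSynced) {
		log.Fatal("cache never synced")
	}
	fmt.Println("caches populated for openshift-multus secrets")
}
```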
Mar 18 16:45:48.214509 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:48.214494 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:48.337479 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:48.337446 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hr9bv"]
Mar 18 16:45:48.341127 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:45:48.341100 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d62bb3_aced_4698_ba43_4c7f828a050d.slice/crio-a7ffcc4fe849c7ef57985db2ce0d86c062eab17f2f024418e78bb62ba76f503e WatchSource:0}: Error finding container a7ffcc4fe849c7ef57985db2ce0d86c062eab17f2f024418e78bb62ba76f503e: Status 404 returned error can't find the container with id a7ffcc4fe849c7ef57985db2ce0d86c062eab17f2f024418e78bb62ba76f503e
Mar 18 16:45:48.529476 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:48.529394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hr9bv" event={"ID":"35d62bb3-aced-4698-ba43-4c7f828a050d","Type":"ContainerStarted","Data":"a7ffcc4fe849c7ef57985db2ce0d86c062eab17f2f024418e78bb62ba76f503e"}
Mar 18 16:45:51.536291 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:51.536253 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hr9bv" event={"ID":"35d62bb3-aced-4698-ba43-4c7f828a050d","Type":"ContainerStarted","Data":"a33c750871a823d13329d2a0fe7189395e2a3ab4bf7cd13c61758e80168b1d6d"}
Mar 18 16:45:51.536689 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:51.536394 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:45:51.551168 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:45:51.551122 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hr9bv" podStartSLOduration=66.898561259 podStartE2EDuration="1m9.551106599s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:45:48.342951873 +0000 UTC m=+66.654204776" lastFinishedPulling="2026-03-18 16:45:50.995497218 +0000 UTC m=+69.306750116" observedRunningTime="2026-03-18 16:45:51.55063862 +0000 UTC m=+69.861891547" watchObservedRunningTime="2026-03-18 16:45:51.551106599 +0000 UTC m=+69.862359513"
Mar 18 16:46:20.036076 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:20.036031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp"
Mar 18 16:46:20.036526 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:20.036090 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns"
Mar 18 16:46:20.036526 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:20.036180 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:46:20.036526 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:20.036182 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:46:20.036526 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:20.036233 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls podName:7443181f-d141-4218-bcc9-ca8fbafa0034 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:24.036219647 +0000 UTC m=+162.347472547 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls") pod "dns-default-697ns" (UID: "7443181f-d141-4218-bcc9-ca8fbafa0034") : secret "dns-default-metrics-tls" not found
Mar 18 16:46:20.036526 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:20.036246 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert podName:df0141d5-7fdb-4d38-a11e-e2f21fffe1bb nodeName:}" failed. No retries permitted until 2026-03-18 16:47:24.036240262 +0000 UTC m=+162.347493162 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert") pod "ingress-canary-cx9cp" (UID: "df0141d5-7fdb-4d38-a11e-e2f21fffe1bb") : secret "canary-serving-cert" not found
Mar 18 16:46:22.540843 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:22.540816 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hr9bv"
Mar 18 16:46:36.168859 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.168821 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks"]
Mar 18 16:46:36.173444 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.173423 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h"]
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.175601 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.175579 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Mar 18 16:46:36.175601 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.175589 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Mar 18 16:46:36.175601 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.175600 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Mar 18 16:46:36.175809 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.175579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-hxgjd\"" Mar 18 16:46:36.175809 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.175579 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:46:36.176262 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.176242 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27"] Mar 18 16:46:36.176414 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.176387 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h" Mar 18 16:46:36.178377 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.178341 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:46:36.178470 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.178460 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Mar 18 16:46:36.178535 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.178478 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-7kjrm\"" Mar 18 16:46:36.178997 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.178976 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:36.180766 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.180743 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Mar 18 16:46:36.181460 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.181444 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-kwq6l\"" Mar 18 16:46:36.181708 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.181687 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Mar 18 16:46:36.181800 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.181697 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Mar 18 16:46:36.181800 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.181767 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Mar 18 16:46:36.186497 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.186475 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks"] Mar 18 16:46:36.187319 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.187272 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h"] Mar 18 16:46:36.187978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.187958 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27"] Mar 18 16:46:36.236928 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.236901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqtr8\" (UniqueName: \"kubernetes.io/projected/50735fa3-85e2-4bcb-ad67-9dd10af25a64-kube-api-access-wqtr8\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:36.236928 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.236929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5w8f\" (UniqueName: \"kubernetes.io/projected/9cb162d5-5574-4cc5-86c3-c3e6f26276b5-kube-api-access-h5w8f\") pod \"volume-data-source-validator-67fdcb5769-mgr8h\" (UID: \"9cb162d5-5574-4cc5-86c3-c3e6f26276b5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h" Mar 18 16:46:36.237073 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.236978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b6b964-0003-450e-8d89-0bc782c50559-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-cwhks\" (UID: \"a6b6b964-0003-450e-8d89-0bc782c50559\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.237073 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.237006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a6b6b964-0003-450e-8d89-0bc782c50559-config\") pod \"kube-storage-version-migrator-operator-866f46547-cwhks\" (UID: \"a6b6b964-0003-450e-8d89-0bc782c50559\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.237073 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.237023 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/50735fa3-85e2-4bcb-ad67-9dd10af25a64-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:36.237203 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.237066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcl9x\" (UniqueName: \"kubernetes.io/projected/a6b6b964-0003-450e-8d89-0bc782c50559-kube-api-access-fcl9x\") pod \"kube-storage-version-migrator-operator-866f46547-cwhks\" (UID: \"a6b6b964-0003-450e-8d89-0bc782c50559\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.237203 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.237097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:36.265297 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.265272 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw"] Mar 18 16:46:36.268321 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.268307 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" Mar 18 16:46:36.270164 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.270146 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-trscj\"" Mar 18 16:46:36.270266 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.270147 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Mar 18 16:46:36.270266 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.270180 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:46:36.270465 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.270451 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Mar 18 16:46:36.274999 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.274974 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw"] Mar 18 16:46:36.337441 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.337412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6b6b964-0003-450e-8d89-0bc782c50559-config\") pod \"kube-storage-version-migrator-operator-866f46547-cwhks\" (UID: \"a6b6b964-0003-450e-8d89-0bc782c50559\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.337644 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.337450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/50735fa3-85e2-4bcb-ad67-9dd10af25a64-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:36.337644 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.337475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcl9x\" (UniqueName: \"kubernetes.io/projected/a6b6b964-0003-450e-8d89-0bc782c50559-kube-api-access-fcl9x\") pod \"kube-storage-version-migrator-operator-866f46547-cwhks\" (UID: \"a6b6b964-0003-450e-8d89-0bc782c50559\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.337644 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.337600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:36.337820 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.337657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4xcj\" (UniqueName: \"kubernetes.io/projected/9143e08e-771d-4436-b780-ab9b14656685-kube-api-access-t4xcj\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: 
\"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" Mar 18 16:46:36.337820 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:36.337708 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:36.337820 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.337720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqtr8\" (UniqueName: \"kubernetes.io/projected/50735fa3-85e2-4bcb-ad67-9dd10af25a64-kube-api-access-wqtr8\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:36.337820 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.337748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5w8f\" (UniqueName: \"kubernetes.io/projected/9cb162d5-5574-4cc5-86c3-c3e6f26276b5-kube-api-access-h5w8f\") pod \"volume-data-source-validator-67fdcb5769-mgr8h\" (UID: \"9cb162d5-5574-4cc5-86c3-c3e6f26276b5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h" Mar 18 16:46:36.337820 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:36.337778 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls podName:50735fa3-85e2-4bcb-ad67-9dd10af25a64 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:36.83775815 +0000 UTC m=+115.149011063 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-rph27" (UID: "50735fa3-85e2-4bcb-ad67-9dd10af25a64") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:36.338050 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.337822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: \"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" Mar 18 16:46:36.338050 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.337883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b6b964-0003-450e-8d89-0bc782c50559-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-cwhks\" (UID: \"a6b6b964-0003-450e-8d89-0bc782c50559\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.338050 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.337956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6b6b964-0003-450e-8d89-0bc782c50559-config\") pod \"kube-storage-version-migrator-operator-866f46547-cwhks\" (UID: \"a6b6b964-0003-450e-8d89-0bc782c50559\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.338172 ip-10-0-139-49 kubenswrapper[2575]: I0318 
16:46:36.338133 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/50735fa3-85e2-4bcb-ad67-9dd10af25a64-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:36.340101 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.340075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b6b964-0003-450e-8d89-0bc782c50559-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-cwhks\" (UID: \"a6b6b964-0003-450e-8d89-0bc782c50559\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.347674 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.347651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcl9x\" (UniqueName: \"kubernetes.io/projected/a6b6b964-0003-450e-8d89-0bc782c50559-kube-api-access-fcl9x\") pod \"kube-storage-version-migrator-operator-866f46547-cwhks\" (UID: \"a6b6b964-0003-450e-8d89-0bc782c50559\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.347674 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.347662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqtr8\" (UniqueName: \"kubernetes.io/projected/50735fa3-85e2-4bcb-ad67-9dd10af25a64-kube-api-access-wqtr8\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:36.347940 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.347922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5w8f\" (UniqueName: \"kubernetes.io/projected/9cb162d5-5574-4cc5-86c3-c3e6f26276b5-kube-api-access-h5w8f\") pod \"volume-data-source-validator-67fdcb5769-mgr8h\" (UID: \"9cb162d5-5574-4cc5-86c3-c3e6f26276b5\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h" Mar 18 16:46:36.438218 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.438129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: \"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" Mar 18 16:46:36.438395 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.438228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4xcj\" (UniqueName: \"kubernetes.io/projected/9143e08e-771d-4436-b780-ab9b14656685-kube-api-access-t4xcj\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: \"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" Mar 18 16:46:36.438395 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:36.438279 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 18 16:46:36.438395 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:36.438376 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls podName:9143e08e-771d-4436-b780-ab9b14656685 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:36.938341966 +0000 UTC m=+115.249594865 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-sr5rw" (UID: "9143e08e-771d-4436-b780-ab9b14656685") : secret "samples-operator-tls" not found Mar 18 16:46:36.446557 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.446535 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4xcj\" (UniqueName: \"kubernetes.io/projected/9143e08e-771d-4436-b780-ab9b14656685-kube-api-access-t4xcj\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: \"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" Mar 18 16:46:36.483994 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.483975 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" Mar 18 16:46:36.491628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.491602 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h" Mar 18 16:46:36.606013 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.605974 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks"] Mar 18 16:46:36.609059 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:46:36.609029 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6b6b964_0003_450e_8d89_0bc782c50559.slice/crio-8f557944c79407df0dbab560d0d044ba70868c6e20fca49dee8d37702797f805 WatchSource:0}: Error finding container 8f557944c79407df0dbab560d0d044ba70868c6e20fca49dee8d37702797f805: Status 404 returned error can't find the container with id 8f557944c79407df0dbab560d0d044ba70868c6e20fca49dee8d37702797f805 Mar 18 16:46:36.618480 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.618455 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h"] Mar 18 16:46:36.620660 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.620630 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" event={"ID":"a6b6b964-0003-450e-8d89-0bc782c50559","Type":"ContainerStarted","Data":"8f557944c79407df0dbab560d0d044ba70868c6e20fca49dee8d37702797f805"} Mar 18 16:46:36.621309 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:46:36.621292 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cb162d5_5574_4cc5_86c3_c3e6f26276b5.slice/crio-b93ccb40bad66dbd7843eb4a7ab34f126decdd9920e3e4c26008385959774b18 WatchSource:0}: Error finding container b93ccb40bad66dbd7843eb4a7ab34f126decdd9920e3e4c26008385959774b18: Status 404 returned error can't find the container with id 
b93ccb40bad66dbd7843eb4a7ab34f126decdd9920e3e4c26008385959774b18 Mar 18 16:46:36.842480 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.842376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:36.842625 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:36.842520 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:36.842625 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:36.842591 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls podName:50735fa3-85e2-4bcb-ad67-9dd10af25a64 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:37.842575906 +0000 UTC m=+116.153828810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-rph27" (UID: "50735fa3-85e2-4bcb-ad67-9dd10af25a64") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:36.943186 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:36.943153 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: \"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" Mar 18 16:46:36.943318 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:36.943293 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 18 16:46:36.943395 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:36.943384 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls podName:9143e08e-771d-4436-b780-ab9b14656685 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:37.943350305 +0000 UTC m=+116.254603204 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-sr5rw" (UID: "9143e08e-771d-4436-b780-ab9b14656685") : secret "samples-operator-tls" not found Mar 18 16:46:37.624529 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:37.624492 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h" event={"ID":"9cb162d5-5574-4cc5-86c3-c3e6f26276b5","Type":"ContainerStarted","Data":"b93ccb40bad66dbd7843eb4a7ab34f126decdd9920e3e4c26008385959774b18"} Mar 18 16:46:37.851040 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:37.850998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:37.851214 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:37.851187 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:37.851282 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:37.851258 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls podName:50735fa3-85e2-4bcb-ad67-9dd10af25a64 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:39.851235777 +0000 UTC m=+118.162488698 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-rph27" (UID: "50735fa3-85e2-4bcb-ad67-9dd10af25a64") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:37.951898 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:37.951858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: \"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" Mar 18 16:46:37.952091 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:37.952044 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 18 16:46:37.952155 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:37.952130 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls podName:9143e08e-771d-4436-b780-ab9b14656685 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:39.952108522 +0000 UTC m=+118.263361426 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-sr5rw" (UID: "9143e08e-771d-4436-b780-ab9b14656685") : secret "samples-operator-tls" not found Mar 18 16:46:37.985476 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:37.985440 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l"] Mar 18 16:46:37.988242 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:37.988220 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:37.990847 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:37.990705 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Mar 18 16:46:37.990847 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:37.990736 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Mar 18 16:46:37.991337 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:37.991318 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:46:37.991437 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:37.991318 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-cn7p8\"" Mar 18 16:46:37.991821 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:37.991793 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Mar 18 16:46:38.002210 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.002183 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l"] Mar 18 16:46:38.052680 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.052637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13e69639-78a2-49ad-ba0a-b3ea46d6a337-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-xrm6l\" (UID: \"13e69639-78a2-49ad-ba0a-b3ea46d6a337\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:38.052807 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.052722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e69639-78a2-49ad-ba0a-b3ea46d6a337-config\") pod \"service-ca-operator-56f6f4cbcb-xrm6l\" (UID: \"13e69639-78a2-49ad-ba0a-b3ea46d6a337\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:38.052807 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.052787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfhz6\" (UniqueName: \"kubernetes.io/projected/13e69639-78a2-49ad-ba0a-b3ea46d6a337-kube-api-access-qfhz6\") pod \"service-ca-operator-56f6f4cbcb-xrm6l\" (UID: \"13e69639-78a2-49ad-ba0a-b3ea46d6a337\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:38.153833 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.153801 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13e69639-78a2-49ad-ba0a-b3ea46d6a337-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-xrm6l\" (UID: \"13e69639-78a2-49ad-ba0a-b3ea46d6a337\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:38.153973 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.153883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e69639-78a2-49ad-ba0a-b3ea46d6a337-config\") pod \"service-ca-operator-56f6f4cbcb-xrm6l\" (UID: \"13e69639-78a2-49ad-ba0a-b3ea46d6a337\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:38.153973 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.153925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfhz6\" (UniqueName: \"kubernetes.io/projected/13e69639-78a2-49ad-ba0a-b3ea46d6a337-kube-api-access-qfhz6\") pod \"service-ca-operator-56f6f4cbcb-xrm6l\" (UID: \"13e69639-78a2-49ad-ba0a-b3ea46d6a337\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:38.154628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.154602 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e69639-78a2-49ad-ba0a-b3ea46d6a337-config\") pod \"service-ca-operator-56f6f4cbcb-xrm6l\" (UID: \"13e69639-78a2-49ad-ba0a-b3ea46d6a337\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:38.156521 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.156500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13e69639-78a2-49ad-ba0a-b3ea46d6a337-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-xrm6l\" (UID: \"13e69639-78a2-49ad-ba0a-b3ea46d6a337\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:38.162469 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.162444 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfhz6\" (UniqueName: \"kubernetes.io/projected/13e69639-78a2-49ad-ba0a-b3ea46d6a337-kube-api-access-qfhz6\") pod \"service-ca-operator-56f6f4cbcb-xrm6l\" (UID: \"13e69639-78a2-49ad-ba0a-b3ea46d6a337\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:38.299752 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.299677 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" Mar 18 16:46:38.627541 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.627514 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h" event={"ID":"9cb162d5-5574-4cc5-86c3-c3e6f26276b5","Type":"ContainerStarted","Data":"a5a8d68914d86d7497b2a89e449c7ed8d3a3db6fdf465c0a058a2fd3328768b7"} Mar 18 16:46:38.655268 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.655229 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-mgr8h" podStartSLOduration=1.245124791 podStartE2EDuration="2.655214262s" podCreationTimestamp="2026-03-18 16:46:36 +0000 UTC" firstStartedPulling="2026-03-18 16:46:36.622948525 +0000 UTC m=+114.934201428" lastFinishedPulling="2026-03-18 16:46:38.033037874 +0000 UTC m=+116.344290899" observedRunningTime="2026-03-18 16:46:38.642619208 +0000 UTC m=+116.953872140" watchObservedRunningTime="2026-03-18 16:46:38.655214262 +0000 UTC m=+116.966467182" Mar 18 16:46:38.655595 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:38.655582 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l"] Mar 18 16:46:38.658107 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:46:38.658083 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13e69639_78a2_49ad_ba0a_b3ea46d6a337.slice/crio-f1ed453d07bdac07631ac6556362e504eacb206e76b1d6960a5e3fba89ca963a WatchSource:0}: Error finding container f1ed453d07bdac07631ac6556362e504eacb206e76b1d6960a5e3fba89ca963a: Status 404 returned error can't find the container with id f1ed453d07bdac07631ac6556362e504eacb206e76b1d6960a5e3fba89ca963a Mar 18 16:46:39.630635 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.630592 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" event={"ID":"13e69639-78a2-49ad-ba0a-b3ea46d6a337","Type":"ContainerStarted","Data":"f1ed453d07bdac07631ac6556362e504eacb206e76b1d6960a5e3fba89ca963a"} Mar 18 16:46:39.632340 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.632296 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" event={"ID":"a6b6b964-0003-450e-8d89-0bc782c50559","Type":"ContainerStarted","Data":"99a61843c492fdde3a0706c76a77c8312c92f197d4b8f294a5703f4421cd7280"} Mar 18 16:46:39.648788 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.648712 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" podStartSLOduration=1.666387574 podStartE2EDuration="3.648698092s" podCreationTimestamp="2026-03-18 16:46:36 +0000 UTC" firstStartedPulling="2026-03-18 16:46:36.610796633 +0000 UTC m=+114.922049534" lastFinishedPulling="2026-03-18 16:46:38.593107154 +0000 UTC m=+116.904360052" observedRunningTime="2026-03-18 16:46:39.64833679 +0000 UTC m=+117.959589713" watchObservedRunningTime="2026-03-18 16:46:39.648698092 +0000 UTC m=+117.959951014" Mar 18 16:46:39.867976 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.867945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" Mar 18 16:46:39.868148 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:39.868095 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:39.868225 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:39.868165 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls podName:50735fa3-85e2-4bcb-ad67-9dd10af25a64 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:43.868148588 +0000 UTC m=+122.179401496 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-rph27" (UID: "50735fa3-85e2-4bcb-ad67-9dd10af25a64") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:39.968764 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.968732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: \"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" Mar 18 16:46:39.968957 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:39.968903 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 18 16:46:39.969018 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:39.968984 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls podName:9143e08e-771d-4436-b780-ab9b14656685 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:43.96896497 +0000 UTC m=+122.280217873 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-sr5rw" (UID: "9143e08e-771d-4436-b780-ab9b14656685") : secret "samples-operator-tls" not found Mar 18 16:46:39.986902 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.984839 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8"] Mar 18 16:46:39.991176 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.991141 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8" Mar 18 16:46:39.993228 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.993206 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Mar 18 16:46:39.993379 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.993273 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-zsm86\"" Mar 18 16:46:39.993548 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.993532 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Mar 18 16:46:39.996128 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:39.996108 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8"] Mar 18 16:46:40.070172 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:40.070143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs8qn\" (UniqueName: \"kubernetes.io/projected/45a755dc-3ea3-453d-b63c-eb0cb38273c6-kube-api-access-zs8qn\") pod \"migrator-6b589cdcc-q79g8\" (UID: \"45a755dc-3ea3-453d-b63c-eb0cb38273c6\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8" Mar 18 16:46:40.171376 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:40.171313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zs8qn\" (UniqueName: \"kubernetes.io/projected/45a755dc-3ea3-453d-b63c-eb0cb38273c6-kube-api-access-zs8qn\") pod \"migrator-6b589cdcc-q79g8\" (UID: \"45a755dc-3ea3-453d-b63c-eb0cb38273c6\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8" Mar 18 16:46:40.180070 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:40.180040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs8qn\" (UniqueName: \"kubernetes.io/projected/45a755dc-3ea3-453d-b63c-eb0cb38273c6-kube-api-access-zs8qn\") pod \"migrator-6b589cdcc-q79g8\" (UID: \"45a755dc-3ea3-453d-b63c-eb0cb38273c6\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8" Mar 18 16:46:40.303385 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:40.303283 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8" Mar 18 16:46:40.489672 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:40.489646 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8"] Mar 18 16:46:40.494017 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:46:40.493980 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a755dc_3ea3_453d_b63c_eb0cb38273c6.slice/crio-44fbb2fc1038e3a88a4150bde45182fc07de4478d3dff8bf011919380e2588e7 WatchSource:0}: Error finding container 44fbb2fc1038e3a88a4150bde45182fc07de4478d3dff8bf011919380e2588e7: Status 404 returned error can't find the container with id 44fbb2fc1038e3a88a4150bde45182fc07de4478d3dff8bf011919380e2588e7 Mar 18 16:46:40.636216 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:40.636130 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" event={"ID":"13e69639-78a2-49ad-ba0a-b3ea46d6a337","Type":"ContainerStarted","Data":"1395c2c998099790122485fcb471064cc470e85c2b81bbf003e73d85fda38e4b"} Mar 18 16:46:40.637259 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:40.637226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8" event={"ID":"45a755dc-3ea3-453d-b63c-eb0cb38273c6","Type":"ContainerStarted","Data":"44fbb2fc1038e3a88a4150bde45182fc07de4478d3dff8bf011919380e2588e7"} Mar 18 16:46:40.653087 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:40.653045 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" podStartSLOduration=1.8909303 podStartE2EDuration="3.653031798s" podCreationTimestamp="2026-03-18 16:46:37 +0000 UTC" firstStartedPulling="2026-03-18 16:46:38.660096961 +0000 UTC m=+116.971349860" lastFinishedPulling="2026-03-18 16:46:40.422198456 +0000 UTC m=+118.733451358" observedRunningTime="2026-03-18 16:46:40.651915287 +0000 UTC m=+118.963168208" watchObservedRunningTime="2026-03-18 16:46:40.653031798 +0000 UTC m=+118.964284718" Mar 18 16:46:41.642017 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.641990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8" event={"ID":"45a755dc-3ea3-453d-b63c-eb0cb38273c6","Type":"ContainerStarted","Data":"1ef32cf0b5863f33785a3f8a7ae53e7a4798c6905bbc1640ec2a4127d49eca99"} Mar 18 16:46:41.642390 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.642027 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8" event={"ID":"45a755dc-3ea3-453d-b63c-eb0cb38273c6","Type":"ContainerStarted","Data":"4e4007935395df75d61dc1006dc4dd31d39a4b0c4354968f87a69d10074cbb1c"} Mar 18 16:46:41.658691 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.658649 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-q79g8" podStartSLOduration=1.693009684 podStartE2EDuration="2.65863501s" podCreationTimestamp="2026-03-18 16:46:39 +0000 UTC" firstStartedPulling="2026-03-18 16:46:40.495838938 +0000 UTC m=+118.807091837" lastFinishedPulling="2026-03-18 16:46:41.461464265 +0000 UTC m=+119.772717163" observedRunningTime="2026-03-18 16:46:41.657983092 +0000 UTC m=+119.969236017" 
watchObservedRunningTime="2026-03-18 16:46:41.65863501 +0000 UTC m=+119.969887932" Mar 18 16:46:41.728608 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.728545 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6f67dbf4f-lgtj8"] Mar 18 16:46:41.731465 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.731450 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.733563 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.733544 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Mar 18 16:46:41.733711 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.733692 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 18 16:46:41.733945 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.733923 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-s9hrx\"" Mar 18 16:46:41.734058 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.734043 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 18 16:46:41.739522 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.739505 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 18 16:46:41.744515 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.744496 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f67dbf4f-lgtj8"] Mar 18 16:46:41.784961 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.784931 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/17876db9-c53e-40dd-80de-09a831ec5a49-ca-trust-extracted\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.785063 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.784991 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-registry-certificates\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.785063 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.785039 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-trusted-ca\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.785170 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.785107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " 
pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.785238 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.785214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-image-registry-private-configuration\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.785295 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.785258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-installation-pull-secrets\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.785295 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.785279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-bound-sa-token\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.785416 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.785310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvwnq\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-kube-api-access-hvwnq\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.886488 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.886448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.886635 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.886510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-image-registry-private-configuration\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.886635 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.886539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-installation-pull-secrets\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.886635 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.886563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-bound-sa-token\") pod 
\"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.886635 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:41.886596 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:46:41.886635 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:41.886632 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f67dbf4f-lgtj8: secret "image-registry-tls" not found Mar 18 16:46:41.886859 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:41.886684 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls podName:17876db9-c53e-40dd-80de-09a831ec5a49 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:42.386666564 +0000 UTC m=+120.697919478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls") pod "image-registry-6f67dbf4f-lgtj8" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49") : secret "image-registry-tls" not found Mar 18 16:46:41.886859 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.886597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvwnq\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-kube-api-access-hvwnq\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.887262 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.887172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/17876db9-c53e-40dd-80de-09a831ec5a49-ca-trust-extracted\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.887392 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.887265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-registry-certificates\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.887392 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.887313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-trusted-ca\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.891240 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.887805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/17876db9-c53e-40dd-80de-09a831ec5a49-ca-trust-extracted\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:46:41.891240 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.887962 2575 
Mar 18 16:46:41.892377 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.892325 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-installation-pull-secrets\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:41.892673 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.892651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-trusted-ca\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:41.893741 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.893719 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-image-registry-private-configuration\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:41.897254 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.897230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-bound-sa-token\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:41.897424 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:41.897409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvwnq\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-kube-api-access-hvwnq\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:42.391306 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:42.391272 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:42.391490 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:42.391430 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:46:42.391490 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:42.391447 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f67dbf4f-lgtj8: secret "image-registry-tls" not found
Mar 18 16:46:42.391577 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:42.391508 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls podName:17876db9-c53e-40dd-80de-09a831ec5a49 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:43.391493594 +0000 UTC m=+121.702746493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls") pod "image-registry-6f67dbf4f-lgtj8" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49") : secret "image-registry-tls" not found
Mar 18 16:46:43.208948 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:43.208921 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kmk8h_0452f967-d43e-4fc8-8591-3f8887d642ef/dns-node-resolver/0.log"
Mar 18 16:46:43.397723 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:43.397690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:43.397892 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:43.397800 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:46:43.397892 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:43.397813 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f67dbf4f-lgtj8: secret "image-registry-tls" not found
Mar 18 16:46:43.397892 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:43.397861 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls podName:17876db9-c53e-40dd-80de-09a831ec5a49 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:45.39784747 +0000 UTC m=+123.709100370 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls") pod "image-registry-6f67dbf4f-lgtj8" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49") : secret "image-registry-tls" not found
Mar 18 16:46:43.902104 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:43.902069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27"
Mar 18 16:46:43.902274 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:43.902214 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:43.902316 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:43.902285 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls podName:50735fa3-85e2-4bcb-ad67-9dd10af25a64 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:51.902270122 +0000 UTC m=+130.213523020 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-rph27" (UID: "50735fa3-85e2-4bcb-ad67-9dd10af25a64") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:44.003403 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:44.003347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: \"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw"
Mar 18 16:46:44.003551 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:44.003467 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 18 16:46:44.003551 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:44.003531 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls podName:9143e08e-771d-4436-b780-ab9b14656685 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:52.003515544 +0000 UTC m=+130.314768443 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-sr5rw" (UID: "9143e08e-771d-4436-b780-ab9b14656685") : secret "samples-operator-tls" not found
Mar 18 16:46:44.008522 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:44.008504 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lqcmn_1def291b-47ff-4e2f-b1a9-30275edc2bd9/node-ca/0.log"
Mar 18 16:46:45.209858 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:45.209818 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-q79g8_45a755dc-3ea3-453d-b63c-eb0cb38273c6/migrator/0.log"
Mar 18 16:46:45.408879 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:45.408843 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-q79g8_45a755dc-3ea3-453d-b63c-eb0cb38273c6/graceful-termination/0.log"
Mar 18 16:46:45.413141 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:45.413121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:45.413238 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:45.413226 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:46:45.413278 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:45.413241 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f67dbf4f-lgtj8: secret "image-registry-tls" not found
Mar 18 16:46:45.413311 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:45.413285 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls podName:17876db9-c53e-40dd-80de-09a831ec5a49 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:49.413272719 +0000 UTC m=+127.724525622 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls") pod "image-registry-6f67dbf4f-lgtj8" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49") : secret "image-registry-tls" not found
Mar 18 16:46:45.610688 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:45.610606 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-cwhks_a6b6b964-0003-450e-8d89-0bc782c50559/kube-storage-version-migrator-operator/0.log"
Mar 18 16:46:49.441861 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:49.441828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:49.442325 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:49.441996 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:46:49.442325 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:49.442021 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f67dbf4f-lgtj8: secret "image-registry-tls" not found
Mar 18 16:46:49.442325 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:49.442106 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls podName:17876db9-c53e-40dd-80de-09a831ec5a49 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:57.442084792 +0000 UTC m=+135.753337699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls") pod "image-registry-6f67dbf4f-lgtj8" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49") : secret "image-registry-tls" not found
Mar 18 16:46:51.962604 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:51.962570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:46:51.963009 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:51.962642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27"
Mar 18 16:46:51.963009 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:51.962723 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:46:51.963009 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:51.962741 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:51.963009 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:51.962796 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls podName:50735fa3-85e2-4bcb-ad67-9dd10af25a64 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:07.962777267 +0000 UTC m=+146.274030167 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-rph27" (UID: "50735fa3-85e2-4bcb-ad67-9dd10af25a64") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:51.963009 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:46:51.962810 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs podName:d6948911-017b-4b29-b362-5520b984c273 nodeName:}" failed. No retries permitted until 2026-03-18 16:48:53.962804355 +0000 UTC m=+252.274057254 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs") pod "network-metrics-daemon-nbj8s" (UID: "d6948911-017b-4b29-b362-5520b984c273") : secret "metrics-daemon-secret" not found
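
The durationBeforeRetry values trace kubelet's per-volume exponential backoff: registry-tls has climbed 500ms, 1s, 2s, 4s, 8s across its retries, cluster-monitoring-operator-tls is at 16s, and metrics-certs has hit 2m2s, which appears to be the ceiling. A minimal sketch of the doubling-with-cap pattern; the initial and maximum delays are inferred from the log, not quoted from kubelet source:

package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initialDelay = 500 * time.Millisecond        // first durationBeforeRetry above
		maxDelay     = 2*time.Minute + 2*time.Second // the 2m2s ceiling metrics-certs reached
	)
	delay := initialDelay
	for i := 0; i < 10; i++ {
		fmt.Println(delay) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
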
Mar 18 16:46:52.063307 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:52.063275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: \"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw"
Mar 18 16:46:52.065796 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:52.065769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9143e08e-771d-4436-b780-ab9b14656685-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-sr5rw\" (UID: \"9143e08e-771d-4436-b780-ab9b14656685\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw"
Mar 18 16:46:52.179286 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:52.179258 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-trscj\""
Mar 18 16:46:52.187853 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:52.187836 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw"
Mar 18 16:46:52.303614 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:52.303582 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw"]
Mar 18 16:46:52.670206 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:52.670172 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" event={"ID":"9143e08e-771d-4436-b780-ab9b14656685","Type":"ContainerStarted","Data":"b3eb786c837b3ebfc143894ef55e5466bab65cc4e603421e31d70f2e7cbb52ce"}
Mar 18 16:46:54.676203 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:54.676169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" event={"ID":"9143e08e-771d-4436-b780-ab9b14656685","Type":"ContainerStarted","Data":"02b05cfcdbe8f4c81f0ea976ffddbfc2d89baac1d731478c3b8ce614d48d692b"}
Mar 18 16:46:54.676203 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:54.676200 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" event={"ID":"9143e08e-771d-4436-b780-ab9b14656685","Type":"ContainerStarted","Data":"b31457f58dbf46fb34ade8924eb722a3bae4c58aedcf5804e4c3ba0d96112bfc"}
Mar 18 16:46:54.692333 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:54.692285 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-sr5rw" podStartSLOduration=17.336360297 podStartE2EDuration="18.692271268s" podCreationTimestamp="2026-03-18 16:46:36 +0000 UTC" firstStartedPulling="2026-03-18 16:46:52.348127645 +0000 UTC m=+130.659380544" lastFinishedPulling="2026-03-18 16:46:53.704038617 +0000 UTC m=+132.015291515" observedRunningTime="2026-03-18 16:46:54.691413028 +0000 UTC m=+133.002665952" watchObservedRunningTime="2026-03-18 16:46:54.692271268 +0000 UTC m=+133.003524188"
Mar 18 16:46:57.501349 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:57.501298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:57.503851 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:57.503831 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls\") pod \"image-registry-6f67dbf4f-lgtj8\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:57.642684 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:57.642628 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-s9hrx\""
Mar 18 16:46:57.650893 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:57.650868 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:57.772586 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:57.772517 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f67dbf4f-lgtj8"]
Mar 18 16:46:57.775321 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:46:57.775292 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17876db9_c53e_40dd_80de_09a831ec5a49.slice/crio-c149cc637e29dc8435cf89fab086c910ef3be6f101fd2a2c70fa51c25f3bed4a WatchSource:0}: Error finding container c149cc637e29dc8435cf89fab086c910ef3be6f101fd2a2c70fa51c25f3bed4a: Status 404 returned error can't find the container with id c149cc637e29dc8435cf89fab086c910ef3be6f101fd2a2c70fa51c25f3bed4a
Mar 18 16:46:58.686787 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:58.686748 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" event={"ID":"17876db9-c53e-40dd-80de-09a831ec5a49","Type":"ContainerStarted","Data":"cf86cba8a39e5a4419a18745802177d628044be3135d81b111cfc387d584adaa"}
Mar 18 16:46:58.686787 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:58.686794 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" event={"ID":"17876db9-c53e-40dd-80de-09a831ec5a49","Type":"ContainerStarted","Data":"c149cc637e29dc8435cf89fab086c910ef3be6f101fd2a2c70fa51c25f3bed4a"}
Mar 18 16:46:58.687228 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:58.686912 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8"
Mar 18 16:46:58.707632 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:46:58.707593 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" podStartSLOduration=17.707578996 podStartE2EDuration="17.707578996s" podCreationTimestamp="2026-03-18 16:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:46:58.706624056 +0000 UTC m=+137.017876976" watchObservedRunningTime="2026-03-18 16:46:58.707578996 +0000 UTC m=+137.018831953"
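
The two duration fields in these startup entries are consistent with podStartSLOduration being podStartE2EDuration with the image-pull window excluded: for cluster-samples-operator, 16:46:53.704038617 minus 16:46:52.348127645 is 1.355910972s of pulling, and 18.692271268s minus that is 17.336360296s, within a nanosecond of the reported 17.336360297 (the SLO figure is logged as a float). A small check of that arithmetic, assuming exactly that relationship:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // Go's default time format, as logged
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	// Values copied from the cluster-samples-operator entry above.
	firstStartedPulling := parse("2026-03-18 16:46:52.348127645 +0000 UTC")
	lastFinishedPulling := parse("2026-03-18 16:46:53.704038617 +0000 UTC")
	e2e := 18692271268 * time.Nanosecond // podStartE2EDuration="18.692271268s"
	pull := lastFinishedPulling.Sub(firstStartedPulling)
	fmt.Println(pull)       // 1.355910972s
	fmt.Println(e2e - pull) // 17.336360296s, matching podStartSLOduration
}
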
Mar 18 16:47:07.976203 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:07.976165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27"
Mar 18 16:47:07.978769 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:07.978743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/50735fa3-85e2-4bcb-ad67-9dd10af25a64-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-rph27\" (UID: \"50735fa3-85e2-4bcb-ad67-9dd10af25a64\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27"
Mar 18 16:47:07.999499 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:07.999474 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-kwq6l\""
Mar 18 16:47:08.007321 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:08.007305 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27"
Mar 18 16:47:08.122495 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:08.122470 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27"]
Mar 18 16:47:08.124865 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:47:08.124831 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50735fa3_85e2_4bcb_ad67_9dd10af25a64.slice/crio-cc4f37f5fb265c9101fbc550a121344cd02e671b5c3a5c7054e6cda0a537e307 WatchSource:0}: Error finding container cc4f37f5fb265c9101fbc550a121344cd02e671b5c3a5c7054e6cda0a537e307: Status 404 returned error can't find the container with id cc4f37f5fb265c9101fbc550a121344cd02e671b5c3a5c7054e6cda0a537e307
Mar 18 16:47:08.712076 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:08.712038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" event={"ID":"50735fa3-85e2-4bcb-ad67-9dd10af25a64","Type":"ContainerStarted","Data":"cc4f37f5fb265c9101fbc550a121344cd02e671b5c3a5c7054e6cda0a537e307"}
Mar 18 16:47:10.573178 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.573143 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f67dbf4f-lgtj8"]
Mar 18 16:47:10.620392 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.620344 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-678cf799c4-b9fk8"]
Mar 18 16:47:10.622310 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.622293 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.626296 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.626273 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vpb6s"]
Mar 18 16:47:10.628351 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.628304 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vpb6s"
Mar 18 16:47:10.630537 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.630517 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Mar 18 16:47:10.630845 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.630576 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Mar 18 16:47:10.631052 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.631025 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h8gkt\""
Mar 18 16:47:10.631178 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.631096 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Mar 18 16:47:10.631452 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.631434 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Mar 18 16:47:10.636222 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.636200 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-678cf799c4-b9fk8"]
Mar 18 16:47:10.639911 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.639891 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vpb6s"]
Mar 18 16:47:10.717800 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.717760 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" event={"ID":"50735fa3-85e2-4bcb-ad67-9dd10af25a64","Type":"ContainerStarted","Data":"2bc14a5b925e6403cfe4ba9448d4b323f66b929e08dff15eb214bd47d498d785"}
Mar 18 16:47:10.735654 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.735596 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-rph27" podStartSLOduration=32.882556062 podStartE2EDuration="34.735577411s" podCreationTimestamp="2026-03-18 16:46:36 +0000 UTC" firstStartedPulling="2026-03-18 16:47:08.126760318 +0000 UTC m=+146.438013218" lastFinishedPulling="2026-03-18 16:47:09.979781667 +0000 UTC m=+148.291034567" observedRunningTime="2026-03-18 16:47:10.734202455 +0000 UTC m=+149.045455377" watchObservedRunningTime="2026-03-18 16:47:10.735577411 +0000 UTC m=+149.046830333"
Mar 18 16:47:10.800046 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6fb9fc9-d5ea-480c-8389-da779da43ffd-bound-sa-token\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.800181 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800053 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/87197265-a9f7-4fd6-bbde-1bec72074140-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s"
Mar 18 16:47:10.800181 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/87197265-a9f7-4fd6-bbde-1bec72074140-crio-socket\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s"
Mar 18 16:47:10.800270 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e6fb9fc9-d5ea-480c-8389-da779da43ffd-ca-trust-extracted\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.800270 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800244 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/87197265-a9f7-4fd6-bbde-1bec72074140-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s"
Mar 18 16:47:10.800347 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800277 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e6fb9fc9-d5ea-480c-8389-da779da43ffd-registry-tls\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.800347 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs2ps\" (UniqueName: \"kubernetes.io/projected/e6fb9fc9-d5ea-480c-8389-da779da43ffd-kube-api-access-hs2ps\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.800347 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e6fb9fc9-d5ea-480c-8389-da779da43ffd-installation-pull-secrets\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.800477 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e6fb9fc9-d5ea-480c-8389-da779da43ffd-image-registry-private-configuration\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.800477 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/87197265-a9f7-4fd6-bbde-1bec72074140-data-volume\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s"
Mar 18 16:47:10.800477 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scwh6\" (UniqueName: \"kubernetes.io/projected/87197265-a9f7-4fd6-bbde-1bec72074140-kube-api-access-scwh6\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s"
Mar 18 16:47:10.800594 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e6fb9fc9-d5ea-480c-8389-da779da43ffd-registry-certificates\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.800594 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.800586 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6fb9fc9-d5ea-480c-8389-da779da43ffd-trusted-ca\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.901966 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.901933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e6fb9fc9-d5ea-480c-8389-da779da43ffd-installation-pull-secrets\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.902145 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.901984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e6fb9fc9-d5ea-480c-8389-da779da43ffd-image-registry-private-configuration\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:10.902145 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/87197265-a9f7-4fd6-bbde-1bec72074140-data-volume\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s"
Mar 18 16:47:10.902145 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scwh6\" (UniqueName: \"kubernetes.io/projected/87197265-a9f7-4fd6-bbde-1bec72074140-kube-api-access-scwh6\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s"
\"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s" Mar 18 16:47:10.902145 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e6fb9fc9-d5ea-480c-8389-da779da43ffd-registry-certificates\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.902145 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6fb9fc9-d5ea-480c-8389-da779da43ffd-trusted-ca\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.902424 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902196 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6fb9fc9-d5ea-480c-8389-da779da43ffd-bound-sa-token\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.902424 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/87197265-a9f7-4fd6-bbde-1bec72074140-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s" Mar 18 16:47:10.902424 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/87197265-a9f7-4fd6-bbde-1bec72074140-crio-socket\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s" Mar 18 16:47:10.902424 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902301 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e6fb9fc9-d5ea-480c-8389-da779da43ffd-ca-trust-extracted\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.902424 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/87197265-a9f7-4fd6-bbde-1bec72074140-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s" Mar 18 16:47:10.902424 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902375 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/87197265-a9f7-4fd6-bbde-1bec72074140-data-volume\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s" Mar 18 
16:47:10.902424 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e6fb9fc9-d5ea-480c-8389-da779da43ffd-registry-tls\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.902424 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs2ps\" (UniqueName: \"kubernetes.io/projected/e6fb9fc9-d5ea-480c-8389-da779da43ffd-kube-api-access-hs2ps\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.902819 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/87197265-a9f7-4fd6-bbde-1bec72074140-crio-socket\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s" Mar 18 16:47:10.902873 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.902841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e6fb9fc9-d5ea-480c-8389-da779da43ffd-ca-trust-extracted\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.903538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.903129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e6fb9fc9-d5ea-480c-8389-da779da43ffd-registry-certificates\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.903538 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.903332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/87197265-a9f7-4fd6-bbde-1bec72074140-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s" Mar 18 16:47:10.903828 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.903778 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6fb9fc9-d5ea-480c-8389-da779da43ffd-trusted-ca\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.905408 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.905387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e6fb9fc9-d5ea-480c-8389-da779da43ffd-installation-pull-secrets\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.905408 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.905390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/87197265-a9f7-4fd6-bbde-1bec72074140-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s" Mar 18 16:47:10.905916 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.905894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e6fb9fc9-d5ea-480c-8389-da779da43ffd-image-registry-private-configuration\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.905956 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.905894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e6fb9fc9-d5ea-480c-8389-da779da43ffd-registry-tls\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.911146 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.911126 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6fb9fc9-d5ea-480c-8389-da779da43ffd-bound-sa-token\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.912303 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.912274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs2ps\" (UniqueName: \"kubernetes.io/projected/e6fb9fc9-d5ea-480c-8389-da779da43ffd-kube-api-access-hs2ps\") pod \"image-registry-678cf799c4-b9fk8\" (UID: \"e6fb9fc9-d5ea-480c-8389-da779da43ffd\") " pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.912618 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.912598 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scwh6\" (UniqueName: \"kubernetes.io/projected/87197265-a9f7-4fd6-bbde-1bec72074140-kube-api-access-scwh6\") pod \"insights-runtime-extractor-vpb6s\" (UID: \"87197265-a9f7-4fd6-bbde-1bec72074140\") " pod="openshift-insights/insights-runtime-extractor-vpb6s" Mar 18 16:47:10.932762 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.932735 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:10.942263 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:10.942145 2575 util.go:30] "No sandbox for pod can be found. 
Mar 18 16:47:11.066287 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:11.066223 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-678cf799c4-b9fk8"]
Mar 18 16:47:11.076932 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:47:11.076875 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6fb9fc9_d5ea_480c_8389_da779da43ffd.slice/crio-98f258cfea8aeb2d82b411b4a5b8f69decb498cc331cf63c6baed26f59532fb5 WatchSource:0}: Error finding container 98f258cfea8aeb2d82b411b4a5b8f69decb498cc331cf63c6baed26f59532fb5: Status 404 returned error can't find the container with id 98f258cfea8aeb2d82b411b4a5b8f69decb498cc331cf63c6baed26f59532fb5
Mar 18 16:47:11.101196 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:11.101167 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vpb6s"]
Mar 18 16:47:11.110569 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:47:11.110538 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87197265_a9f7_4fd6_bbde_1bec72074140.slice/crio-7b410cd87f7662a722c7818a875ddbe1cfa85e13436a1fdc81c279c795dbe4a6 WatchSource:0}: Error finding container 7b410cd87f7662a722c7818a875ddbe1cfa85e13436a1fdc81c279c795dbe4a6: Status 404 returned error can't find the container with id 7b410cd87f7662a722c7818a875ddbe1cfa85e13436a1fdc81c279c795dbe4a6
Mar 18 16:47:11.722067 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:11.722030 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" event={"ID":"e6fb9fc9-d5ea-480c-8389-da779da43ffd","Type":"ContainerStarted","Data":"2ea611e3a71e912e79db06b96959b62c84997d47ef3ba4e925bec35a0e324297"}
Mar 18 16:47:11.722067 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:11.722070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" event={"ID":"e6fb9fc9-d5ea-480c-8389-da779da43ffd","Type":"ContainerStarted","Data":"98f258cfea8aeb2d82b411b4a5b8f69decb498cc331cf63c6baed26f59532fb5"}
Mar 18 16:47:11.722598 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:11.722150 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-678cf799c4-b9fk8"
Mar 18 16:47:11.723618 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:11.723593 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vpb6s" event={"ID":"87197265-a9f7-4fd6-bbde-1bec72074140","Type":"ContainerStarted","Data":"cac91b0844f7ca2fb1f27a9ad79b471c1013c2c33298e1aa6ec5b2f0c2fce382"}
Mar 18 16:47:11.723729 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:11.723620 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vpb6s" event={"ID":"87197265-a9f7-4fd6-bbde-1bec72074140","Type":"ContainerStarted","Data":"7b410cd87f7662a722c7818a875ddbe1cfa85e13436a1fdc81c279c795dbe4a6"}
Mar 18 16:47:11.748255 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:11.746468 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" podStartSLOduration=1.7464506910000002 podStartE2EDuration="1.746450691s" podCreationTimestamp="2026-03-18 16:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:47:11.745177596 +0000 UTC m=+150.056430518" watchObservedRunningTime="2026-03-18 16:47:11.746450691 +0000 UTC m=+150.057703616"
Mar 18 16:47:12.728552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:12.728512 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vpb6s" event={"ID":"87197265-a9f7-4fd6-bbde-1bec72074140","Type":"ContainerStarted","Data":"b449b82a2cc4e05cd3d3b5b8905613990515f64e123defab0036e4d32634d40d"}
Mar 18 16:47:13.732341 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:13.732307 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vpb6s" event={"ID":"87197265-a9f7-4fd6-bbde-1bec72074140","Type":"ContainerStarted","Data":"0b54b8abb4fb53fef85f32f35be7de08b2bafca585e8cc6c8ee24638291fc7ae"}
Mar 18 16:47:13.752741 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:13.752697 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vpb6s" podStartSLOduration=1.730023266 podStartE2EDuration="3.752683653s" podCreationTimestamp="2026-03-18 16:47:10 +0000 UTC" firstStartedPulling="2026-03-18 16:47:11.17591324 +0000 UTC m=+149.487166142" lastFinishedPulling="2026-03-18 16:47:13.198573627 +0000 UTC m=+151.509826529" observedRunningTime="2026-03-18 16:47:13.752096252 +0000 UTC m=+152.063349198" watchObservedRunningTime="2026-03-18 16:47:13.752683653 +0000 UTC m=+152.063936571"
Mar 18 16:47:14.663714 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.663681 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-kphbn"]
Mar 18 16:47:14.665936 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.665922 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.667880 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.667856 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-5nqrn\""
Mar 18 16:47:14.668471 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.668436 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Mar 18 16:47:14.668726 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.668694 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Mar 18 16:47:14.668948 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.668934 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Mar 18 16:47:14.677258 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.677239 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-kphbn"]
Mar 18 16:47:14.833913 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.833887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a090e3ca-196c-44e4-8c04-ba9e61392e3c-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.834251 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.833965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a090e3ca-196c-44e4-8c04-ba9e61392e3c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.834251 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.834024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhg9j\" (UniqueName: \"kubernetes.io/projected/a090e3ca-196c-44e4-8c04-ba9e61392e3c-kube-api-access-lhg9j\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.834251 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.834073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a090e3ca-196c-44e4-8c04-ba9e61392e3c-metrics-client-ca\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.935079 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.934977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a090e3ca-196c-44e4-8c04-ba9e61392e3c-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.935079 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.935051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a090e3ca-196c-44e4-8c04-ba9e61392e3c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.935264 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.935103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhg9j\" (UniqueName: \"kubernetes.io/projected/a090e3ca-196c-44e4-8c04-ba9e61392e3c-kube-api-access-lhg9j\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.935264 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.935130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a090e3ca-196c-44e4-8c04-ba9e61392e3c-metrics-client-ca\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.935842 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.935818 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a090e3ca-196c-44e4-8c04-ba9e61392e3c-metrics-client-ca\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.937589 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.937565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a090e3ca-196c-44e4-8c04-ba9e61392e3c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.937686 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.937641 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a090e3ca-196c-44e4-8c04-ba9e61392e3c-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.943860 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.943836 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhg9j\" (UniqueName: \"kubernetes.io/projected/a090e3ca-196c-44e4-8c04-ba9e61392e3c-kube-api-access-lhg9j\") pod \"prometheus-operator-6b948c769-kphbn\" (UID: \"a090e3ca-196c-44e4-8c04-ba9e61392e3c\") " pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:14.974940 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:14.974905 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn"
Mar 18 16:47:15.091161 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:15.091128 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-kphbn"]
Mar 18 16:47:15.093691 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:47:15.093663 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda090e3ca_196c_44e4_8c04_ba9e61392e3c.slice/crio-60a5247c9f3a65194d6203bef12f49bf75ddc9aa93a9edd508fd4c3237294d77 WatchSource:0}: Error finding container 60a5247c9f3a65194d6203bef12f49bf75ddc9aa93a9edd508fd4c3237294d77: Status 404 returned error can't find the container with id 60a5247c9f3a65194d6203bef12f49bf75ddc9aa93a9edd508fd4c3237294d77
Mar 18 16:47:15.739156 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:15.739124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn" event={"ID":"a090e3ca-196c-44e4-8c04-ba9e61392e3c","Type":"ContainerStarted","Data":"60a5247c9f3a65194d6203bef12f49bf75ddc9aa93a9edd508fd4c3237294d77"}
Mar 18 16:47:16.743482 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:16.743446 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn" event={"ID":"a090e3ca-196c-44e4-8c04-ba9e61392e3c","Type":"ContainerStarted","Data":"63ddd9066d2aa00023085c28b405890a7d4be083795c9c2f5a99b1ec291ba08a"}
Mar 18 16:47:16.743847 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:16.743488 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn" event={"ID":"a090e3ca-196c-44e4-8c04-ba9e61392e3c","Type":"ContainerStarted","Data":"12328cdd40834e8b56ec8e93f38fd1c45187d87d612ae5d4128e6575cf877a5a"}
Mar 18 16:47:16.762642 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:16.762597 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6b948c769-kphbn" podStartSLOduration=1.486240778 podStartE2EDuration="2.762578784s" podCreationTimestamp="2026-03-18 16:47:14 +0000 UTC" firstStartedPulling="2026-03-18 16:47:15.095479294 +0000 UTC m=+153.406732194" lastFinishedPulling="2026-03-18 16:47:16.371817301 +0000 UTC m=+154.683070200" observedRunningTime="2026-03-18 16:47:16.761738194 +0000 UTC m=+155.072991151" watchObservedRunningTime="2026-03-18 16:47:16.762578784 +0000 UTC m=+155.073831739"
Mar 18 16:47:19.154986 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:47:19.154948 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-697ns" podUID="7443181f-d141-4218-bcc9-ca8fbafa0034"
Mar 18 16:47:19.168127 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:47:19.168093 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cx9cp" podUID="df0141d5-7fdb-4d38-a11e-e2f21fffe1bb"
Mar 18 16:47:19.401991 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.401955 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2m2rc"]
Mar 18 16:47:19.404350 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.404335 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.410025 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.410004 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Mar 18 16:47:19.410854 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.410534 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Mar 18 16:47:19.411798 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.411780 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-775mt\"" Mar 18 16:47:19.415061 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.415046 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Mar 18 16:47:19.574378 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.574329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-accelerators-collector-config\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.574571 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.574394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.574571 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.574445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrxx7\" (UniqueName: \"kubernetes.io/projected/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-kube-api-access-jrxx7\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.574571 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.574499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-wtmp\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.574571 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.574533 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-textfile\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.574571 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.574558 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-root\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " 
pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.574800 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.574585 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-sys\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.574800 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.574613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-metrics-client-ca\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.574800 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.574643 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-tls\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.675737 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-accelerators-collector-config\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.675737 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.675922 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrxx7\" (UniqueName: \"kubernetes.io/projected/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-kube-api-access-jrxx7\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.675922 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-wtmp\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.675922 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-textfile\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.675922 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675843 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-root\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.675922 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-sys\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.675922 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-metrics-client-ca\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.676203 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-tls\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.676203 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675959 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-wtmp\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.676203 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.675974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-root\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.676203 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.676046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-sys\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.676203 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.676164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-textfile\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.676471 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.676422 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-metrics-client-ca\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.676947 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.676924 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-accelerators-collector-config\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.678241 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.678218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.678430 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.678414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-node-exporter-tls\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.686015 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.685995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrxx7\" (UniqueName: \"kubernetes.io/projected/1ee329a3-8fbd-4f66-b69a-0534ee4fe51c-kube-api-access-jrxx7\") pod \"node-exporter-2m2rc\" (UID: \"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c\") " pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.713001 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.712981 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2m2rc" Mar 18 16:47:19.722074 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:47:19.722049 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ee329a3_8fbd_4f66_b69a_0534ee4fe51c.slice/crio-a7d5db036900588fffe27ccea0c3ae81d5a71df4a367cbd0826db5d1e296397a WatchSource:0}: Error finding container a7d5db036900588fffe27ccea0c3ae81d5a71df4a367cbd0826db5d1e296397a: Status 404 returned error can't find the container with id a7d5db036900588fffe27ccea0c3ae81d5a71df4a367cbd0826db5d1e296397a Mar 18 16:47:19.751633 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.751604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2m2rc" event={"ID":"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c","Type":"ContainerStarted","Data":"a7d5db036900588fffe27ccea0c3ae81d5a71df4a367cbd0826db5d1e296397a"} Mar 18 16:47:19.751708 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:19.751654 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-697ns" Mar 18 16:47:20.297667 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:47:20.297634 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-nbj8s" podUID="d6948911-017b-4b29-b362-5520b984c273" Mar 18 16:47:20.580060 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:20.579715 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:47:20.756042 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:20.756004 2575 generic.go:358] "Generic (PLEG): container finished" podID="1ee329a3-8fbd-4f66-b69a-0534ee4fe51c" containerID="7a66f15a25852aeffcae548b3447491e9360126df1ff283ebe2cd8bbf7ccf6fd" exitCode=0 Mar 18 16:47:20.756188 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:20.756066 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2m2rc" event={"ID":"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c","Type":"ContainerDied","Data":"7a66f15a25852aeffcae548b3447491e9360126df1ff283ebe2cd8bbf7ccf6fd"} Mar 18 16:47:21.761209 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:21.761179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2m2rc" event={"ID":"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c","Type":"ContainerStarted","Data":"57eb89fc75b86aa142755e60cc694fbc3009016859b39de6c11372c72e6c2eea"} Mar 18 16:47:21.761691 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:21.761216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2m2rc" event={"ID":"1ee329a3-8fbd-4f66-b69a-0534ee4fe51c","Type":"ContainerStarted","Data":"83c64f92acc1f105e9ee6839b2cdb68f2f1ef693520703392239a3edcc4ae533"} Mar 18 16:47:21.803982 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:21.803940 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2m2rc" podStartSLOduration=2.087008935 podStartE2EDuration="2.803926407s" podCreationTimestamp="2026-03-18 16:47:19 +0000 UTC" firstStartedPulling="2026-03-18 16:47:19.724175234 +0000 UTC m=+158.035428134" lastFinishedPulling="2026-03-18 16:47:20.441092706 +0000 UTC m=+158.752345606" observedRunningTime="2026-03-18 16:47:21.802535005 +0000 UTC m=+160.113787926" watchObservedRunningTime="2026-03-18 16:47:21.803926407 +0000 UTC m=+160.115179328" Mar 18 16:47:24.112011 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:24.111976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns" Mar 18 16:47:24.112407 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:24.112059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp" Mar 18 16:47:24.114584 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:24.114560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/df0141d5-7fdb-4d38-a11e-e2f21fffe1bb-cert\") pod \"ingress-canary-cx9cp\" (UID: \"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb\") " pod="openshift-ingress-canary/ingress-canary-cx9cp" Mar 18 16:47:24.114584 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:24.114575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7443181f-d141-4218-bcc9-ca8fbafa0034-metrics-tls\") pod \"dns-default-697ns\" (UID: \"7443181f-d141-4218-bcc9-ca8fbafa0034\") " pod="openshift-dns/dns-default-697ns" Mar 18 16:47:24.255453 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:24.255423 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zpjpl\"" Mar 18 16:47:24.263989 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:24.263966 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-697ns" Mar 18 16:47:24.393690 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:24.393667 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-697ns"] Mar 18 16:47:24.398696 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:47:24.398664 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7443181f_d141_4218_bcc9_ca8fbafa0034.slice/crio-0f50c823e8b5e1d7e23f6263bd454e126224afbb132ace246f3ac58ceef40720 WatchSource:0}: Error finding container 0f50c823e8b5e1d7e23f6263bd454e126224afbb132ace246f3ac58ceef40720: Status 404 returned error can't find the container with id 0f50c823e8b5e1d7e23f6263bd454e126224afbb132ace246f3ac58ceef40720 Mar 18 16:47:24.769998 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:24.769911 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-697ns" event={"ID":"7443181f-d141-4218-bcc9-ca8fbafa0034","Type":"ContainerStarted","Data":"0f50c823e8b5e1d7e23f6263bd454e126224afbb132ace246f3ac58ceef40720"} Mar 18 16:47:26.775797 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:26.775764 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-697ns" event={"ID":"7443181f-d141-4218-bcc9-ca8fbafa0034","Type":"ContainerStarted","Data":"0ac218488bbbf2d325f9731561d5b374d7ec2906f7c32f2cd4561e2362df36a6"} Mar 18 16:47:26.775797 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:26.775801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-697ns" event={"ID":"7443181f-d141-4218-bcc9-ca8fbafa0034","Type":"ContainerStarted","Data":"b6a4630c77246c6ddf57e4dcbfa5d584522c20950190f4f15e2bc1fa056d8040"} Mar 18 16:47:26.776181 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:26.775891 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-697ns" Mar 18 16:47:26.804815 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:26.804771 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-697ns" podStartSLOduration=129.41661045 podStartE2EDuration="2m10.804757143s" podCreationTimestamp="2026-03-18 16:45:16 +0000 UTC" firstStartedPulling="2026-03-18 16:47:24.400714963 +0000 UTC m=+162.711967863" lastFinishedPulling="2026-03-18 16:47:25.78886164 +0000 UTC m=+164.100114556" observedRunningTime="2026-03-18 16:47:26.803757048 +0000 UTC m=+165.115009969" watchObservedRunningTime="2026-03-18 16:47:26.804757143 +0000 UTC m=+165.116010064" Mar 18 16:47:30.286368 ip-10-0-139-49 
kubenswrapper[2575]: I0318 16:47:30.286324 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cx9cp" Mar 18 16:47:30.289000 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:30.288983 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9v9vx\"" Mar 18 16:47:30.297695 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:30.297679 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cx9cp" Mar 18 16:47:30.420816 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:30.420785 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cx9cp"] Mar 18 16:47:30.425679 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:47:30.425651 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf0141d5_7fdb_4d38_a11e_e2f21fffe1bb.slice/crio-a5dee11a4e91e033b09ae8d0e3f18dee08dbcb1174c4f49d8118f83af9758fb4 WatchSource:0}: Error finding container a5dee11a4e91e033b09ae8d0e3f18dee08dbcb1174c4f49d8118f83af9758fb4: Status 404 returned error can't find the container with id a5dee11a4e91e033b09ae8d0e3f18dee08dbcb1174c4f49d8118f83af9758fb4 Mar 18 16:47:30.788473 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:30.788444 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cx9cp" event={"ID":"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb","Type":"ContainerStarted","Data":"a5dee11a4e91e033b09ae8d0e3f18dee08dbcb1174c4f49d8118f83af9758fb4"} Mar 18 16:47:32.732868 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:32.732837 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-678cf799c4-b9fk8" Mar 18 16:47:32.795197 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:32.795165 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cx9cp" event={"ID":"df0141d5-7fdb-4d38-a11e-e2f21fffe1bb","Type":"ContainerStarted","Data":"6bba25ac68c06cc449377febd66929d72823712b9e304d7cf3fa266ce15f08e1"} Mar 18 16:47:32.810399 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:32.810332 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cx9cp" podStartSLOduration=135.278411587 podStartE2EDuration="2m16.810316922s" podCreationTimestamp="2026-03-18 16:45:16 +0000 UTC" firstStartedPulling="2026-03-18 16:47:30.427733427 +0000 UTC m=+168.738986330" lastFinishedPulling="2026-03-18 16:47:31.959638765 +0000 UTC m=+170.270891665" observedRunningTime="2026-03-18 16:47:32.81005055 +0000 UTC m=+171.121303472" watchObservedRunningTime="2026-03-18 16:47:32.810316922 +0000 UTC m=+171.121569844" Mar 18 16:47:35.285677 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.285643 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s" Mar 18 16:47:35.458184 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.458153 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-5b85974fd6-jcxw6"] Mar 18 16:47:35.461529 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.461513 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-5b85974fd6-jcxw6" Mar 18 16:47:35.463597 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.463573 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 18 16:47:35.463730 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.463609 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 18 16:47:35.463730 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.463607 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-lfxtw\"" Mar 18 16:47:35.472865 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.472846 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-5b85974fd6-jcxw6"] Mar 18 16:47:35.506701 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.506676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfz7f\" (UniqueName: \"kubernetes.io/projected/d6c35e1e-b5a5-4e45-a199-26600d617988-kube-api-access-mfz7f\") pod \"downloads-5b85974fd6-jcxw6\" (UID: \"d6c35e1e-b5a5-4e45-a199-26600d617988\") " pod="openshift-console/downloads-5b85974fd6-jcxw6" Mar 18 16:47:35.592399 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.592323 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" podUID="17876db9-c53e-40dd-80de-09a831ec5a49" containerName="registry" containerID="cri-o://cf86cba8a39e5a4419a18745802177d628044be3135d81b111cfc387d584adaa" gracePeriod=30 Mar 18 16:47:35.607976 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.607956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfz7f\" (UniqueName: \"kubernetes.io/projected/d6c35e1e-b5a5-4e45-a199-26600d617988-kube-api-access-mfz7f\") pod \"downloads-5b85974fd6-jcxw6\" (UID: \"d6c35e1e-b5a5-4e45-a199-26600d617988\") " pod="openshift-console/downloads-5b85974fd6-jcxw6" Mar 18 16:47:35.616098 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.616071 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfz7f\" (UniqueName: \"kubernetes.io/projected/d6c35e1e-b5a5-4e45-a199-26600d617988-kube-api-access-mfz7f\") pod \"downloads-5b85974fd6-jcxw6\" (UID: \"d6c35e1e-b5a5-4e45-a199-26600d617988\") " pod="openshift-console/downloads-5b85974fd6-jcxw6" Mar 18 16:47:35.770785 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.770762 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-5b85974fd6-jcxw6" Mar 18 16:47:35.808348 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.808314 2575 generic.go:358] "Generic (PLEG): container finished" podID="17876db9-c53e-40dd-80de-09a831ec5a49" containerID="cf86cba8a39e5a4419a18745802177d628044be3135d81b111cfc387d584adaa" exitCode=0 Mar 18 16:47:35.808537 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.808391 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" event={"ID":"17876db9-c53e-40dd-80de-09a831ec5a49","Type":"ContainerDied","Data":"cf86cba8a39e5a4419a18745802177d628044be3135d81b111cfc387d584adaa"} Mar 18 16:47:35.822183 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.822157 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:47:35.898386 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.898348 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-5b85974fd6-jcxw6"] Mar 18 16:47:35.900739 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:47:35.900713 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6c35e1e_b5a5_4e45_a199_26600d617988.slice/crio-69e52513c0df0c76c8cde283dba15cdf54d161e925c64007fc5d0a799bcabfaf WatchSource:0}: Error finding container 69e52513c0df0c76c8cde283dba15cdf54d161e925c64007fc5d0a799bcabfaf: Status 404 returned error can't find the container with id 69e52513c0df0c76c8cde283dba15cdf54d161e925c64007fc5d0a799bcabfaf Mar 18 16:47:35.909945 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.909927 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvwnq\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-kube-api-access-hvwnq\") pod \"17876db9-c53e-40dd-80de-09a831ec5a49\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " Mar 18 16:47:35.910040 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.909977 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-image-registry-private-configuration\") pod \"17876db9-c53e-40dd-80de-09a831ec5a49\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " Mar 18 16:47:35.910040 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.910017 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls\") pod \"17876db9-c53e-40dd-80de-09a831ec5a49\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " Mar 18 16:47:35.910201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.910078 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/17876db9-c53e-40dd-80de-09a831ec5a49-ca-trust-extracted\") pod \"17876db9-c53e-40dd-80de-09a831ec5a49\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " Mar 18 16:47:35.910201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.910120 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-bound-sa-token\") pod \"17876db9-c53e-40dd-80de-09a831ec5a49\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " Mar 18 16:47:35.910201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.910145 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-installation-pull-secrets\") pod \"17876db9-c53e-40dd-80de-09a831ec5a49\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " Mar 18 16:47:35.910201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.910174 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-trusted-ca\") pod \"17876db9-c53e-40dd-80de-09a831ec5a49\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " Mar 18 16:47:35.910411 ip-10-0-139-49 
kubenswrapper[2575]: I0318 16:47:35.910202 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-registry-certificates\") pod \"17876db9-c53e-40dd-80de-09a831ec5a49\" (UID: \"17876db9-c53e-40dd-80de-09a831ec5a49\") " Mar 18 16:47:35.911002 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.910961 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "17876db9-c53e-40dd-80de-09a831ec5a49" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:47:35.911523 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.911499 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "17876db9-c53e-40dd-80de-09a831ec5a49" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:47:35.912561 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.912532 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "17876db9-c53e-40dd-80de-09a831ec5a49" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:47:35.912651 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.912564 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "17876db9-c53e-40dd-80de-09a831ec5a49" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:47:35.912651 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.912642 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "17876db9-c53e-40dd-80de-09a831ec5a49" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:47:35.912743 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.912666 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "17876db9-c53e-40dd-80de-09a831ec5a49" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:47:35.912743 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.912701 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-kube-api-access-hvwnq" (OuterVolumeSpecName: "kube-api-access-hvwnq") pod "17876db9-c53e-40dd-80de-09a831ec5a49" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49"). InnerVolumeSpecName "kube-api-access-hvwnq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:47:35.919136 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:35.919115 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17876db9-c53e-40dd-80de-09a831ec5a49-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "17876db9-c53e-40dd-80de-09a831ec5a49" (UID: "17876db9-c53e-40dd-80de-09a831ec5a49"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:47:36.011095 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.011073 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/17876db9-c53e-40dd-80de-09a831ec5a49-ca-trust-extracted\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:47:36.011095 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.011093 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-bound-sa-token\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:47:36.011201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.011105 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-installation-pull-secrets\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:47:36.011201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.011114 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-trusted-ca\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:47:36.011201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.011122 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/17876db9-c53e-40dd-80de-09a831ec5a49-registry-certificates\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:47:36.011201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.011130 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hvwnq\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-kube-api-access-hvwnq\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:47:36.011201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.011139 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/17876db9-c53e-40dd-80de-09a831ec5a49-image-registry-private-configuration\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:47:36.011201 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.011148 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/17876db9-c53e-40dd-80de-09a831ec5a49-registry-tls\") on node 
\"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:47:36.780845 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.780810 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-697ns" Mar 18 16:47:36.812814 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.812784 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" event={"ID":"17876db9-c53e-40dd-80de-09a831ec5a49","Type":"ContainerDied","Data":"c149cc637e29dc8435cf89fab086c910ef3be6f101fd2a2c70fa51c25f3bed4a"} Mar 18 16:47:36.812974 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.812826 2575 scope.go:117] "RemoveContainer" containerID="cf86cba8a39e5a4419a18745802177d628044be3135d81b111cfc387d584adaa" Mar 18 16:47:36.812974 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.812839 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f67dbf4f-lgtj8" Mar 18 16:47:36.814407 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.814386 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-5b85974fd6-jcxw6" event={"ID":"d6c35e1e-b5a5-4e45-a199-26600d617988","Type":"ContainerStarted","Data":"69e52513c0df0c76c8cde283dba15cdf54d161e925c64007fc5d0a799bcabfaf"} Mar 18 16:47:36.827642 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.827620 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f67dbf4f-lgtj8"] Mar 18 16:47:36.830801 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:36.830781 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6f67dbf4f-lgtj8"] Mar 18 16:47:38.290966 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:38.290929 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17876db9-c53e-40dd-80de-09a831ec5a49" path="/var/lib/kubelet/pods/17876db9-c53e-40dd-80de-09a831ec5a49/volumes" Mar 18 16:47:46.552081 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.552044 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-798979bb8b-mnh29"] Mar 18 16:47:46.552627 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.552396 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17876db9-c53e-40dd-80de-09a831ec5a49" containerName="registry" Mar 18 16:47:46.552627 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.552412 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="17876db9-c53e-40dd-80de-09a831ec5a49" containerName="registry" Mar 18 16:47:46.552627 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.552472 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="17876db9-c53e-40dd-80de-09a831ec5a49" containerName="registry" Mar 18 16:47:46.555205 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.555177 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.557328 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.557190 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 18 16:47:46.557328 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.557278 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Mar 18 16:47:46.557521 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.557473 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-fj66z\"" Mar 18 16:47:46.557587 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.557560 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 18 16:47:46.557669 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.557646 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 18 16:47:46.557731 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.557718 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Mar 18 16:47:46.566271 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.566246 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-798979bb8b-mnh29"] Mar 18 16:47:46.702125 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.702092 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-config\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.702293 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.702133 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-service-ca\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.702293 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.702164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-serving-cert\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.702293 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.702184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gxpj\" (UniqueName: \"kubernetes.io/projected/1121bff0-0f26-4c36-b74b-447e2c0a6d43-kube-api-access-7gxpj\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.702293 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.702219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-oauth-config\") pod \"console-798979bb8b-mnh29\" 
(UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.702496 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.702330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-oauth-serving-cert\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.803558 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.803467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-oauth-serving-cert\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.803558 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.803540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-config\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.803781 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.803573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-service-ca\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.803781 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.803610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-serving-cert\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.803781 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.803642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gxpj\" (UniqueName: \"kubernetes.io/projected/1121bff0-0f26-4c36-b74b-447e2c0a6d43-kube-api-access-7gxpj\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.803781 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.803679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-oauth-config\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.804321 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.804281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-oauth-serving-cert\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.804321 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.804281 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-config\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.804526 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.804412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-service-ca\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.806550 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.806526 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-oauth-config\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.806736 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.806684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-serving-cert\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.811859 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.811835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gxpj\" (UniqueName: \"kubernetes.io/projected/1121bff0-0f26-4c36-b74b-447e2c0a6d43-kube-api-access-7gxpj\") pod \"console-798979bb8b-mnh29\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") " pod="openshift-console/console-798979bb8b-mnh29" Mar 18 16:47:46.866304 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:46.866272 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-798979bb8b-mnh29"
Mar 18 16:47:51.357547 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:51.357523 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-798979bb8b-mnh29"]
Mar 18 16:47:51.365515 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:47:51.365490 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1121bff0_0f26_4c36_b74b_447e2c0a6d43.slice/crio-531ba48238e37414befdc6734bd02e0e6cda8e4ae99e9e0f6f2c9f5f8dd835f5 WatchSource:0}: Error finding container 531ba48238e37414befdc6734bd02e0e6cda8e4ae99e9e0f6f2c9f5f8dd835f5: Status 404 returned error can't find the container with id 531ba48238e37414befdc6734bd02e0e6cda8e4ae99e9e0f6f2c9f5f8dd835f5
Mar 18 16:47:51.868038 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:51.866205 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-5b85974fd6-jcxw6" event={"ID":"d6c35e1e-b5a5-4e45-a199-26600d617988","Type":"ContainerStarted","Data":"21fc83fb0f5731dcecd259fef56d0ee4b991c89ac2deb7388ba65cb6faafacad"}
Mar 18 16:47:51.868038 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:51.866887 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-5b85974fd6-jcxw6"
Mar 18 16:47:51.868756 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:51.868727 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-798979bb8b-mnh29" event={"ID":"1121bff0-0f26-4c36-b74b-447e2c0a6d43","Type":"ContainerStarted","Data":"531ba48238e37414befdc6734bd02e0e6cda8e4ae99e9e0f6f2c9f5f8dd835f5"}
Mar 18 16:47:51.884703 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:51.884292 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-5b85974fd6-jcxw6" podStartSLOduration=1.454674388 podStartE2EDuration="16.884280144s" podCreationTimestamp="2026-03-18 16:47:35 +0000 UTC" firstStartedPulling="2026-03-18 16:47:35.90254292 +0000 UTC m=+174.213795819" lastFinishedPulling="2026-03-18 16:47:51.332148674 +0000 UTC m=+189.643401575" observedRunningTime="2026-03-18 16:47:51.883428606 +0000 UTC m=+190.194681523" watchObservedRunningTime="2026-03-18 16:47:51.884280144 +0000 UTC m=+190.195533065"
Mar 18 16:47:51.888242 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:51.888218 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-5b85974fd6-jcxw6"
Mar 18 16:47:53.355071 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:53.355031 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-rph27_50735fa3-85e2-4bcb-ad67-9dd10af25a64/cluster-monitoring-operator/0.log"
Mar 18 16:47:54.550565 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:54.550542 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2m2rc_1ee329a3-8fbd-4f66-b69a-0534ee4fe51c/init-textfile/0.log"
Mar 18 16:47:54.751419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:54.751310 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2m2rc_1ee329a3-8fbd-4f66-b69a-0534ee4fe51c/node-exporter/0.log"
Mar 18 16:47:54.880154 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:54.880116 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-798979bb8b-mnh29" event={"ID":"1121bff0-0f26-4c36-b74b-447e2c0a6d43","Type":"ContainerStarted","Data":"41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758"}
Mar 18 16:47:54.881683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:54.881647 2575 generic.go:358] "Generic (PLEG): container finished" podID="a6b6b964-0003-450e-8d89-0bc782c50559" containerID="99a61843c492fdde3a0706c76a77c8312c92f197d4b8f294a5703f4421cd7280" exitCode=0
Mar 18 16:47:54.881818 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:54.881701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" event={"ID":"a6b6b964-0003-450e-8d89-0bc782c50559","Type":"ContainerDied","Data":"99a61843c492fdde3a0706c76a77c8312c92f197d4b8f294a5703f4421cd7280"}
Mar 18 16:47:54.882005 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:54.881988 2575 scope.go:117] "RemoveContainer" containerID="99a61843c492fdde3a0706c76a77c8312c92f197d4b8f294a5703f4421cd7280"
Mar 18 16:47:54.900664 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:54.900629 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-798979bb8b-mnh29" podStartSLOduration=5.785308253 podStartE2EDuration="8.900615978s" podCreationTimestamp="2026-03-18 16:47:46 +0000 UTC" firstStartedPulling="2026-03-18 16:47:51.367375868 +0000 UTC m=+189.678628777" lastFinishedPulling="2026-03-18 16:47:54.482683599 +0000 UTC m=+192.793936502" observedRunningTime="2026-03-18 16:47:54.899505106 +0000 UTC m=+193.210758029" watchObservedRunningTime="2026-03-18 16:47:54.900615978 +0000 UTC m=+193.211868898"
Mar 18 16:47:54.949924 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:54.949877 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2m2rc_1ee329a3-8fbd-4f66-b69a-0534ee4fe51c/kube-rbac-proxy/0.log"
Mar 18 16:47:55.767263 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.767033 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cd5c4bdbf-6qdv6"]
Mar 18 16:47:55.790086 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.790054 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd5c4bdbf-6qdv6"]
Mar 18 16:47:55.790237 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.790193 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.797325 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.797300 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Mar 18 16:47:55.886601 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.886566 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-cwhks" event={"ID":"a6b6b964-0003-450e-8d89-0bc782c50559","Type":"ContainerStarted","Data":"a43366ba684a6482cb162044c3e0ef13c46b9de98034699816ec6fbbe7785e6b"}
Mar 18 16:47:55.886786 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.886603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-console-config\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.886786 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.886665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-service-ca\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.886786 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.886707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-oauth-serving-cert\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.886786 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.886761 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-serving-cert\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.886965 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.886798 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-trusted-ca-bundle\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.886965 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.886837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-oauth-config\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.886965 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.886865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nqqv\" (UniqueName: \"kubernetes.io/projected/680b3441-1e68-45ab-963c-4079511d3591-kube-api-access-2nqqv\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.987553 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.987512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-service-ca\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.987725 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.987700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-oauth-serving-cert\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.987845 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.987811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-serving-cert\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.987940 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.987899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-trusted-ca-bundle\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.987997 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.987938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-oauth-config\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.987997 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.987989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nqqv\" (UniqueName: \"kubernetes.io/projected/680b3441-1e68-45ab-963c-4079511d3591-kube-api-access-2nqqv\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.988091 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.988072 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-console-config\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.988266 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.988242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-service-ca\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.988473 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.988447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-oauth-serving-cert\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.988786 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.988757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-console-config\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.989043 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.989011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-trusted-ca-bundle\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.990908 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.990876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-serving-cert\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.990908 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.990896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-oauth-config\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:55.997001 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:55.996983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nqqv\" (UniqueName: \"kubernetes.io/projected/680b3441-1e68-45ab-963c-4079511d3591-kube-api-access-2nqqv\") pod \"console-7cd5c4bdbf-6qdv6\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") " pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:56.101924 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.101836 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:47:56.238031 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.238002 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd5c4bdbf-6qdv6"]
Mar 18 16:47:56.240896 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:47:56.240862 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod680b3441_1e68_45ab_963c_4079511d3591.slice/crio-43e1b9168cdf2eda450c1bb9b346544fd9734798f1c5080e5990e5b05017ae05 WatchSource:0}: Error finding container 43e1b9168cdf2eda450c1bb9b346544fd9734798f1c5080e5990e5b05017ae05: Status 404 returned error can't find the container with id 43e1b9168cdf2eda450c1bb9b346544fd9734798f1c5080e5990e5b05017ae05
Mar 18 16:47:56.866942 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.866902 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-798979bb8b-mnh29"
Mar 18 16:47:56.867316 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.866966 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-798979bb8b-mnh29"
Mar 18 16:47:56.871818 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.871792 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-798979bb8b-mnh29"
Mar 18 16:47:56.890502 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.890468 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd5c4bdbf-6qdv6" event={"ID":"680b3441-1e68-45ab-963c-4079511d3591","Type":"ContainerStarted","Data":"06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574"}
Mar 18 16:47:56.890632 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.890505 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd5c4bdbf-6qdv6" event={"ID":"680b3441-1e68-45ab-963c-4079511d3591","Type":"ContainerStarted","Data":"43e1b9168cdf2eda450c1bb9b346544fd9734798f1c5080e5990e5b05017ae05"}
Mar 18 16:47:56.891851 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.891828 2575 generic.go:358] "Generic (PLEG): container finished" podID="13e69639-78a2-49ad-ba0a-b3ea46d6a337" containerID="1395c2c998099790122485fcb471064cc470e85c2b81bbf003e73d85fda38e4b" exitCode=0
Mar 18 16:47:56.891956 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.891899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" event={"ID":"13e69639-78a2-49ad-ba0a-b3ea46d6a337","Type":"ContainerDied","Data":"1395c2c998099790122485fcb471064cc470e85c2b81bbf003e73d85fda38e4b"}
Mar 18 16:47:56.892237 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.892220 2575 scope.go:117] "RemoveContainer" containerID="1395c2c998099790122485fcb471064cc470e85c2b81bbf003e73d85fda38e4b"
Mar 18 16:47:56.895846 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.895829 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-798979bb8b-mnh29"
Mar 18 16:47:56.907627 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:56.907594 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cd5c4bdbf-6qdv6" podStartSLOduration=1.907584025 podStartE2EDuration="1.907584025s" podCreationTimestamp="2026-03-18 16:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:47:56.906513006 +0000 UTC m=+195.217765931" watchObservedRunningTime="2026-03-18 16:47:56.907584025 +0000 UTC m=+195.218836946"
Mar 18 16:47:57.896619 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:57.896583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-xrm6l" event={"ID":"13e69639-78a2-49ad-ba0a-b3ea46d6a337","Type":"ContainerStarted","Data":"3daae2ba432bb75086fe04a380178187a2edb3bcffb496cc1baaf05ef3ebca03"}
Mar 18 16:47:58.352994 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:58.352910 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-kphbn_a090e3ca-196c-44e4-8c04-ba9e61392e3c/prometheus-operator/0.log"
Mar 18 16:47:58.549781 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:47:58.549746 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-kphbn_a090e3ca-196c-44e4-8c04-ba9e61392e3c/kube-rbac-proxy/0.log"
Mar 18 16:48:01.351651 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:01.351623 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-798979bb8b-mnh29_1121bff0-0f26-4c36-b74b-447e2c0a6d43/console/0.log"
Mar 18 16:48:01.551790 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:01.550898 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-5b85974fd6-jcxw6_d6c35e1e-b5a5-4e45-a199-26600d617988/download-server/0.log"
Mar 18 16:48:02.349982 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:02.349957 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cx9cp_df0141d5-7fdb-4d38-a11e-e2f21fffe1bb/serve-healthcheck-canary/0.log"
Mar 18 16:48:06.102557 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:06.102518 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:48:06.102557 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:06.102567 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:48:06.107209 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:06.107185 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:48:06.924799 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:06.924767 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:48:06.971569 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:06.971540 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-798979bb8b-mnh29"]
Mar 18 16:48:31.996521 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:31.996460 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-798979bb8b-mnh29" podUID="1121bff0-0f26-4c36-b74b-447e2c0a6d43" containerName="console" containerID="cri-o://41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758" gracePeriod=15
Mar 18 16:48:32.227510 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.227490 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-798979bb8b-mnh29_1121bff0-0f26-4c36-b74b-447e2c0a6d43/console/0.log"
Mar 18 16:48:32.227621 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.227548 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-798979bb8b-mnh29"
Mar 18 16:48:32.276942 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.276876 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-serving-cert\") pod \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") "
Mar 18 16:48:32.276942 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.276909 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-config\") pod \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") "
Mar 18 16:48:32.276942 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.276935 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-oauth-config\") pod \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") "
Mar 18 16:48:32.277185 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.276956 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-service-ca\") pod \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") "
Mar 18 16:48:32.277185 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.276988 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-oauth-serving-cert\") pod \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") "
Mar 18 16:48:32.277185 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.277009 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gxpj\" (UniqueName: \"kubernetes.io/projected/1121bff0-0f26-4c36-b74b-447e2c0a6d43-kube-api-access-7gxpj\") pod \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\" (UID: \"1121bff0-0f26-4c36-b74b-447e2c0a6d43\") "
Mar 18 16:48:32.277377 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.277333 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-config" (OuterVolumeSpecName: "console-config") pod "1121bff0-0f26-4c36-b74b-447e2c0a6d43" (UID: "1121bff0-0f26-4c36-b74b-447e2c0a6d43"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:48:32.277437 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.277384 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-service-ca" (OuterVolumeSpecName: "service-ca") pod "1121bff0-0f26-4c36-b74b-447e2c0a6d43" (UID: "1121bff0-0f26-4c36-b74b-447e2c0a6d43"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:48:32.277437 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.277391 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1121bff0-0f26-4c36-b74b-447e2c0a6d43" (UID: "1121bff0-0f26-4c36-b74b-447e2c0a6d43"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:48:32.279227 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.279206 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1121bff0-0f26-4c36-b74b-447e2c0a6d43" (UID: "1121bff0-0f26-4c36-b74b-447e2c0a6d43"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:32.279323 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.279299 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1121bff0-0f26-4c36-b74b-447e2c0a6d43-kube-api-access-7gxpj" (OuterVolumeSpecName: "kube-api-access-7gxpj") pod "1121bff0-0f26-4c36-b74b-447e2c0a6d43" (UID: "1121bff0-0f26-4c36-b74b-447e2c0a6d43"). InnerVolumeSpecName "kube-api-access-7gxpj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:48:32.279392 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.279322 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1121bff0-0f26-4c36-b74b-447e2c0a6d43" (UID: "1121bff0-0f26-4c36-b74b-447e2c0a6d43"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:48:32.377867 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.377832 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-oauth-config\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:48:32.377867 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.377861 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-service-ca\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:48:32.377867 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.377871 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-oauth-serving-cert\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:48:32.378073 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.377880 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7gxpj\" (UniqueName: \"kubernetes.io/projected/1121bff0-0f26-4c36-b74b-447e2c0a6d43-kube-api-access-7gxpj\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:48:32.378073 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.377890 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-serving-cert\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:48:32.378073 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.377899 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1121bff0-0f26-4c36-b74b-447e2c0a6d43-console-config\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:48:32.990680 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.990650 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-798979bb8b-mnh29_1121bff0-0f26-4c36-b74b-447e2c0a6d43/console/0.log"
Mar 18 16:48:32.990827 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.990696 2575 generic.go:358] "Generic (PLEG): container finished" podID="1121bff0-0f26-4c36-b74b-447e2c0a6d43" containerID="41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758" exitCode=2
Mar 18 16:48:32.990827 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.990731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-798979bb8b-mnh29" event={"ID":"1121bff0-0f26-4c36-b74b-447e2c0a6d43","Type":"ContainerDied","Data":"41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758"}
Mar 18 16:48:32.990827 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.990770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-798979bb8b-mnh29" event={"ID":"1121bff0-0f26-4c36-b74b-447e2c0a6d43","Type":"ContainerDied","Data":"531ba48238e37414befdc6734bd02e0e6cda8e4ae99e9e0f6f2c9f5f8dd835f5"}
Mar 18 16:48:32.990827 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.990778 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-798979bb8b-mnh29"
Mar 18 16:48:32.991002 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.990785 2575 scope.go:117] "RemoveContainer" containerID="41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758"
Mar 18 16:48:32.998904 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.998680 2575 scope.go:117] "RemoveContainer" containerID="41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758"
Mar 18 16:48:32.999154 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:48:32.998908 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758\": container with ID starting with 41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758 not found: ID does not exist" containerID="41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758"
Mar 18 16:48:32.999154 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:32.998949 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758"} err="failed to get container status \"41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758\": rpc error: code = NotFound desc = could not find container \"41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758\": container with ID starting with 41668fa781866175e6868a80e0e4811deabde9b9b6993f7a559342855037e758 not found: ID does not exist"
Mar 18 16:48:33.004903 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:33.004878 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-798979bb8b-mnh29"]
Mar 18 16:48:33.008380 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:33.008346 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-798979bb8b-mnh29"]
Mar 18 16:48:34.289788 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:34.289757 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1121bff0-0f26-4c36-b74b-447e2c0a6d43" path="/var/lib/kubelet/pods/1121bff0-0f26-4c36-b74b-447e2c0a6d43/volumes"
Mar 18 16:48:43.608950 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.608918 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65ff667744-cfz6r"]
Mar 18 16:48:43.609333 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.609166 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1121bff0-0f26-4c36-b74b-447e2c0a6d43" containerName="console"
Mar 18 16:48:43.609333 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.609176 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1121bff0-0f26-4c36-b74b-447e2c0a6d43" containerName="console"
Mar 18 16:48:43.609333 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.609232 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1121bff0-0f26-4c36-b74b-447e2c0a6d43" containerName="console"
Mar 18 16:48:43.613660 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.613644 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.624743 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.624717 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65ff667744-cfz6r"]
Mar 18 16:48:43.763600 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.763570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-serving-cert\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.763600 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.763606 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-oauth-config\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.763786 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.763644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-trusted-ca-bundle\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.763786 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.763674 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-oauth-serving-cert\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.763786 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.763729 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-console-config\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.763786 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.763749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkh49\" (UniqueName: \"kubernetes.io/projected/0a41a65a-dd92-4261-8e3b-b795190bc745-kube-api-access-pkh49\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.763786 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.763768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-service-ca\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.864936 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.864856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-trusted-ca-bundle\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.864936 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.864887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-oauth-serving-cert\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.864936 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.864919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-console-config\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.864936 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.864935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkh49\" (UniqueName: \"kubernetes.io/projected/0a41a65a-dd92-4261-8e3b-b795190bc745-kube-api-access-pkh49\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.865250 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.865048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-service-ca\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.865250 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.865113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-serving-cert\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.865250 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.865140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-oauth-config\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.865724 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.865691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-console-config\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.865848 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.865764 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-oauth-serving-cert\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.865848 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.865808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-service-ca\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.865848 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.865837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-trusted-ca-bundle\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.867675 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.867653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-serving-cert\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.867675 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.867673 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-oauth-config\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.872515 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.872484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkh49\" (UniqueName: \"kubernetes.io/projected/0a41a65a-dd92-4261-8e3b-b795190bc745-kube-api-access-pkh49\") pod \"console-65ff667744-cfz6r\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") " pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:43.922352 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:43.922328 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:44.042129 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:44.042100 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65ff667744-cfz6r"]
Mar 18 16:48:44.044827 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:48:44.044795 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a41a65a_dd92_4261_8e3b_b795190bc745.slice/crio-a4526effcb02b384e5aca7c826817250cc807989131b429f7a3b23da1e75164d WatchSource:0}: Error finding container a4526effcb02b384e5aca7c826817250cc807989131b429f7a3b23da1e75164d: Status 404 returned error can't find the container with id a4526effcb02b384e5aca7c826817250cc807989131b429f7a3b23da1e75164d
Mar 18 16:48:45.024088 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:45.024055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65ff667744-cfz6r" event={"ID":"0a41a65a-dd92-4261-8e3b-b795190bc745","Type":"ContainerStarted","Data":"d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7"}
Mar 18 16:48:45.024088 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:45.024088 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65ff667744-cfz6r" event={"ID":"0a41a65a-dd92-4261-8e3b-b795190bc745","Type":"ContainerStarted","Data":"a4526effcb02b384e5aca7c826817250cc807989131b429f7a3b23da1e75164d"}
Mar 18 16:48:45.042677 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:45.042631 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65ff667744-cfz6r" podStartSLOduration=2.042617916 podStartE2EDuration="2.042617916s" podCreationTimestamp="2026-03-18 16:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:48:45.041816105 +0000 UTC m=+243.353069026" watchObservedRunningTime="2026-03-18 16:48:45.042617916 +0000 UTC m=+243.353870837"
Mar 18 16:48:53.922812 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:53.922722 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:53.922812 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:53.922778 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:53.927237 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:53.927217 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:54.049767 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:54.049716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:48:54.052178 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:54.052147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6948911-017b-4b29-b362-5520b984c273-metrics-certs\") pod \"network-metrics-daemon-nbj8s\" (UID: \"d6948911-017b-4b29-b362-5520b984c273\") " pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:48:54.052884 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:54.052869 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:48:54.102556 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:54.102520 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cd5c4bdbf-6qdv6"]
Mar 18 16:48:54.187989 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:54.187912 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-97tcn\""
Mar 18 16:48:54.196640 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:54.196617 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nbj8s"
Mar 18 16:48:54.317837 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:54.317738 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nbj8s"]
Mar 18 16:48:54.320455 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:48:54.320427 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6948911_017b_4b29_b362_5520b984c273.slice/crio-4469ddd568840c471bf1c1629e6c30ee943d70240eb45c9144f222fb939b7f28 WatchSource:0}: Error finding container 4469ddd568840c471bf1c1629e6c30ee943d70240eb45c9144f222fb939b7f28: Status 404 returned error can't find the container with id 4469ddd568840c471bf1c1629e6c30ee943d70240eb45c9144f222fb939b7f28
Mar 18 16:48:55.053031 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:55.052984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nbj8s" event={"ID":"d6948911-017b-4b29-b362-5520b984c273","Type":"ContainerStarted","Data":"4469ddd568840c471bf1c1629e6c30ee943d70240eb45c9144f222fb939b7f28"}
Mar 18 16:48:56.057155 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:56.057115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nbj8s" event={"ID":"d6948911-017b-4b29-b362-5520b984c273","Type":"ContainerStarted","Data":"6818dfbf09f53fc113544596e82c9d233bc7d11992410fe74d8493c50e8163bf"}
Mar 18 16:48:56.057155 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:56.057151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nbj8s" event={"ID":"d6948911-017b-4b29-b362-5520b984c273","Type":"ContainerStarted","Data":"e966c45366b79f8e24b72178941c36bc665b7d4d98e8094cfc4a69f058714ee9"}
Mar 18 16:48:56.073944 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:48:56.073893 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nbj8s" podStartSLOduration=252.898465932 podStartE2EDuration="4m14.073879575s" podCreationTimestamp="2026-03-18 16:44:42 +0000 UTC" firstStartedPulling="2026-03-18 16:48:54.32225072 +0000 UTC m=+252.633503619" lastFinishedPulling="2026-03-18 16:48:55.497664359 +0000 UTC m=+253.808917262" observedRunningTime="2026-03-18 16:48:56.072464678 +0000 UTC m=+254.383717612" watchObservedRunningTime="2026-03-18 16:48:56.073879575 +0000 UTC m=+254.385132509"
Mar 18 16:49:06.485986 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.485950 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gdm4b"]
Mar 18 16:49:06.489192 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.489171 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.491074 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.491056 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Mar 18 16:49:06.496572 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.496553 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gdm4b"]
Mar 18 16:49:06.638789 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.638758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5-original-pull-secret\") pod \"global-pull-secret-syncer-gdm4b\" (UID: \"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5\") " pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.638955 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.638809 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5-kubelet-config\") pod \"global-pull-secret-syncer-gdm4b\" (UID: \"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5\") " pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.638955 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.638914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5-dbus\") pod \"global-pull-secret-syncer-gdm4b\" (UID: \"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5\") " pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.739521 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.739443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5-dbus\") pod \"global-pull-secret-syncer-gdm4b\" (UID: \"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5\") " pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.739521 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.739480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5-original-pull-secret\") pod \"global-pull-secret-syncer-gdm4b\" (UID: \"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5\") " pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.739680 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.739611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5-kubelet-config\") pod \"global-pull-secret-syncer-gdm4b\" (UID: \"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5\") " pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.739680 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.739643 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5-dbus\") pod \"global-pull-secret-syncer-gdm4b\" (UID: \"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5\") " pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.739758 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.739734 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5-kubelet-config\") pod \"global-pull-secret-syncer-gdm4b\" (UID: \"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5\") " pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.741895 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.741875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5-original-pull-secret\") pod \"global-pull-secret-syncer-gdm4b\" (UID: \"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5\") " pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.798598 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.798576 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gdm4b"
Mar 18 16:49:06.912513 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:06.912482 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gdm4b"]
Mar 18 16:49:06.915557 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:49:06.915530 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e6ee6d9_0f25_4ecd_b286_3b4f51bd62a5.slice/crio-e1ede5a9f42a1bf9420de3f422930e60834cf3d657a4e2a24fde03eab68900c1 WatchSource:0}: Error finding container e1ede5a9f42a1bf9420de3f422930e60834cf3d657a4e2a24fde03eab68900c1: Status 404 returned error can't find the container with id e1ede5a9f42a1bf9420de3f422930e60834cf3d657a4e2a24fde03eab68900c1
Mar 18 16:49:07.091776 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:07.091701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gdm4b" event={"ID":"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5","Type":"ContainerStarted","Data":"e1ede5a9f42a1bf9420de3f422930e60834cf3d657a4e2a24fde03eab68900c1"}
Mar 18 16:49:11.107615 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:11.107577 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gdm4b" event={"ID":"4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5","Type":"ContainerStarted","Data":"317f46fcbe1827cc131578021c3349c8ca1d64b97891fb5a445151119863372b"}
Mar 18 16:49:19.121634 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.121586 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7cd5c4bdbf-6qdv6" podUID="680b3441-1e68-45ab-963c-4079511d3591" containerName="console" containerID="cri-o://06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574" gracePeriod=15
Mar 18 16:49:19.366292 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.366235 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cd5c4bdbf-6qdv6_680b3441-1e68-45ab-963c-4079511d3591/console/0.log"
Mar 18 16:49:19.366434 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.366303 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:49:19.383337 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.383249 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gdm4b" podStartSLOduration=9.694374917 podStartE2EDuration="13.383234981s" podCreationTimestamp="2026-03-18 16:49:06 +0000 UTC" firstStartedPulling="2026-03-18 16:49:06.917171736 +0000 UTC m=+265.228424641" lastFinishedPulling="2026-03-18 16:49:10.606031805 +0000 UTC m=+268.917284705" observedRunningTime="2026-03-18 16:49:11.125297394 +0000 UTC m=+269.436550318" watchObservedRunningTime="2026-03-18 16:49:19.383234981 +0000 UTC m=+277.694487903"
Mar 18 16:49:19.433890 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.433863 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-oauth-serving-cert\") pod \"680b3441-1e68-45ab-963c-4079511d3591\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") "
Mar 18 16:49:19.433890 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.433901 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-trusted-ca-bundle\") pod \"680b3441-1e68-45ab-963c-4079511d3591\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") "
Mar 18 16:49:19.434115 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.433930 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-serving-cert\") pod \"680b3441-1e68-45ab-963c-4079511d3591\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") "
Mar 18 16:49:19.434115 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.433965 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nqqv\" (UniqueName: \"kubernetes.io/projected/680b3441-1e68-45ab-963c-4079511d3591-kube-api-access-2nqqv\") pod \"680b3441-1e68-45ab-963c-4079511d3591\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") "
Mar 18 16:49:19.434115 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.433989 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-oauth-config\") pod \"680b3441-1e68-45ab-963c-4079511d3591\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") "
Mar 18 16:49:19.434115 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.434041 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-console-config\") pod \"680b3441-1e68-45ab-963c-4079511d3591\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") "
Mar 18 16:49:19.434115 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.434091 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-service-ca\") pod \"680b3441-1e68-45ab-963c-4079511d3591\" (UID: \"680b3441-1e68-45ab-963c-4079511d3591\") "
Mar 18 16:49:19.434342 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.434301 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "680b3441-1e68-45ab-963c-4079511d3591" (UID: "680b3441-1e68-45ab-963c-4079511d3591"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:19.434508 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.434465 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "680b3441-1e68-45ab-963c-4079511d3591" (UID: "680b3441-1e68-45ab-963c-4079511d3591"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:19.434627 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.434538 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-console-config" (OuterVolumeSpecName: "console-config") pod "680b3441-1e68-45ab-963c-4079511d3591" (UID: "680b3441-1e68-45ab-963c-4079511d3591"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:19.434627 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.434574 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-service-ca" (OuterVolumeSpecName: "service-ca") pod "680b3441-1e68-45ab-963c-4079511d3591" (UID: "680b3441-1e68-45ab-963c-4079511d3591"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:19.436429 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.436397 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680b3441-1e68-45ab-963c-4079511d3591-kube-api-access-2nqqv" (OuterVolumeSpecName: "kube-api-access-2nqqv") pod "680b3441-1e68-45ab-963c-4079511d3591" (UID: "680b3441-1e68-45ab-963c-4079511d3591"). InnerVolumeSpecName "kube-api-access-2nqqv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:49:19.436529 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.436397 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "680b3441-1e68-45ab-963c-4079511d3591" (UID: "680b3441-1e68-45ab-963c-4079511d3591"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:19.436587 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.436540 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "680b3441-1e68-45ab-963c-4079511d3591" (UID: "680b3441-1e68-45ab-963c-4079511d3591"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:19.535251 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.535208 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-service-ca\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:49:19.535251 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.535243 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-oauth-serving-cert\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:49:19.535251 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.535253 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-trusted-ca-bundle\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:49:19.535251 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.535262 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-serving-cert\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:49:19.535541 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.535271 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2nqqv\" (UniqueName: \"kubernetes.io/projected/680b3441-1e68-45ab-963c-4079511d3591-kube-api-access-2nqqv\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:49:19.535541 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.535280 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/680b3441-1e68-45ab-963c-4079511d3591-console-oauth-config\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:49:19.535541 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:19.535288 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/680b3441-1e68-45ab-963c-4079511d3591-console-config\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:49:20.133522 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.133495 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cd5c4bdbf-6qdv6_680b3441-1e68-45ab-963c-4079511d3591/console/0.log"
Mar 18 16:49:20.133995 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.133537 2575 generic.go:358] "Generic (PLEG): container finished" podID="680b3441-1e68-45ab-963c-4079511d3591" containerID="06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574" exitCode=2
Mar 18 16:49:20.133995 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.133568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd5c4bdbf-6qdv6" event={"ID":"680b3441-1e68-45ab-963c-4079511d3591","Type":"ContainerDied","Data":"06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574"}
Mar 18 16:49:20.133995 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.133616 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd5c4bdbf-6qdv6" event={"ID":"680b3441-1e68-45ab-963c-4079511d3591","Type":"ContainerDied","Data":"43e1b9168cdf2eda450c1bb9b346544fd9734798f1c5080e5990e5b05017ae05"}
Mar 18 16:49:20.133995 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.133625 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd5c4bdbf-6qdv6"
Mar 18 16:49:20.133995 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.133632 2575 scope.go:117] "RemoveContainer" containerID="06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574"
Mar 18 16:49:20.142813 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.142798 2575 scope.go:117] "RemoveContainer" containerID="06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574"
Mar 18 16:49:20.143057 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:20.143036 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574\": container with ID starting with 06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574 not found: ID does not exist" containerID="06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574"
Mar 18 16:49:20.143121 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.143067 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574"} err="failed to get container status \"06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574\": rpc error: code = NotFound desc = could not find container \"06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574\": container with ID starting with 06a17d8c23cae4d97422e9744a8d1a5a47a66d245f293358b4973d253770b574 not found: ID does not exist"
Mar 18 16:49:20.153722 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.153694 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cd5c4bdbf-6qdv6"]
Mar 18 16:49:20.157317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.157286 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7cd5c4bdbf-6qdv6"]
Mar 18 16:49:20.289432 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:20.289348 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680b3441-1e68-45ab-963c-4079511d3591" path="/var/lib/kubelet/pods/680b3441-1e68-45ab-963c-4079511d3591/volumes"
Mar 18 16:49:31.523079 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.523046 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f"]
Mar 18 16:49:31.523492 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.523324 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="680b3441-1e68-45ab-963c-4079511d3591" containerName="console"
Mar 18 16:49:31.523492 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.523335 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="680b3441-1e68-45ab-963c-4079511d3591" containerName="console"
Mar 18 16:49:31.523492 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.523399 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="680b3441-1e68-45ab-963c-4079511d3591" containerName="console"
Mar 18 16:49:31.526098 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.526083 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.527883 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.527856 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Mar 18 16:49:31.527883 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.527875 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-m227c\"" Mar 18 16:49:31.528388 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.528351 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Mar 18 16:49:31.536312 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.536207 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f"] Mar 18 16:49:31.617675 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.617647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.617822 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.617682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzhr8\" (UniqueName: \"kubernetes.io/projected/82cbca28-0058-49ab-b79d-bedd3ee673f4-kube-api-access-hzhr8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.617822 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.617745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.719066 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.719036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.719143 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.719075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzhr8\" (UniqueName: \"kubernetes.io/projected/82cbca28-0058-49ab-b79d-bedd3ee673f4-kube-api-access-hzhr8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.719143 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.719119 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.719482 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.719462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.719524 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.719485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.727015 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.726993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzhr8\" (UniqueName: \"kubernetes.io/projected/82cbca28-0058-49ab-b79d-bedd3ee673f4-kube-api-access-hzhr8\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.836094 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.836029 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:31.962905 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:31.962873 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f"] Mar 18 16:49:31.966231 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:49:31.966201 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82cbca28_0058_49ab_b79d_bedd3ee673f4.slice/crio-5af81338f3817fe1d3d9d4f717592c8091e67dbea81481a1ba95e2d5dec974ef WatchSource:0}: Error finding container 5af81338f3817fe1d3d9d4f717592c8091e67dbea81481a1ba95e2d5dec974ef: Status 404 returned error can't find the container with id 5af81338f3817fe1d3d9d4f717592c8091e67dbea81481a1ba95e2d5dec974ef Mar 18 16:49:32.167904 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:32.167874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" event={"ID":"82cbca28-0058-49ab-b79d-bedd3ee673f4","Type":"ContainerStarted","Data":"5af81338f3817fe1d3d9d4f717592c8091e67dbea81481a1ba95e2d5dec974ef"} Mar 18 16:49:38.191924 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:38.191888 2575 generic.go:358] "Generic (PLEG): container finished" podID="82cbca28-0058-49ab-b79d-bedd3ee673f4" containerID="4ec77e02e6a9ed9afaebd9830b839b8ec3e361d16e9604100fc6a8a7985f39ea" exitCode=0 Mar 18 16:49:38.192405 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:38.191959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" event={"ID":"82cbca28-0058-49ab-b79d-bedd3ee673f4","Type":"ContainerDied","Data":"4ec77e02e6a9ed9afaebd9830b839b8ec3e361d16e9604100fc6a8a7985f39ea"} Mar 18 16:49:41.203153 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:41.203121 2575 generic.go:358] "Generic (PLEG): container finished" podID="82cbca28-0058-49ab-b79d-bedd3ee673f4" containerID="121b7abcb36d48335d85885d3ee1449443dcb2cd35a9925cffad23b21a0efb50" exitCode=0 Mar 18 16:49:41.203534 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:41.203184 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" event={"ID":"82cbca28-0058-49ab-b79d-bedd3ee673f4","Type":"ContainerDied","Data":"121b7abcb36d48335d85885d3ee1449443dcb2cd35a9925cffad23b21a0efb50"} Mar 18 16:49:42.172802 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:42.172761 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 16:49:42.173503 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:42.173415 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 16:49:42.177020 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:42.176996 2575 kubelet.go:1628] "Image garbage collection succeeded" Mar 18 16:49:47.221846 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:47.221809 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" 
event={"ID":"82cbca28-0058-49ab-b79d-bedd3ee673f4","Type":"ContainerStarted","Data":"c103c78baf7d52199acd25e3ee307f9891ace6473028838f45cb071ebf8aa4e6"} Mar 18 16:49:47.254052 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:47.253991 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" podStartSLOduration=1.094703064 podStartE2EDuration="16.253971534s" podCreationTimestamp="2026-03-18 16:49:31 +0000 UTC" firstStartedPulling="2026-03-18 16:49:31.968415229 +0000 UTC m=+290.279668132" lastFinishedPulling="2026-03-18 16:49:47.127683703 +0000 UTC m=+305.438936602" observedRunningTime="2026-03-18 16:49:47.25077489 +0000 UTC m=+305.562027811" watchObservedRunningTime="2026-03-18 16:49:47.253971534 +0000 UTC m=+305.565224457" Mar 18 16:49:48.227024 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:48.226984 2575 generic.go:358] "Generic (PLEG): container finished" podID="82cbca28-0058-49ab-b79d-bedd3ee673f4" containerID="c103c78baf7d52199acd25e3ee307f9891ace6473028838f45cb071ebf8aa4e6" exitCode=0 Mar 18 16:49:48.227513 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:48.227101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" event={"ID":"82cbca28-0058-49ab-b79d-bedd3ee673f4","Type":"ContainerDied","Data":"c103c78baf7d52199acd25e3ee307f9891ace6473028838f45cb071ebf8aa4e6"} Mar 18 16:49:49.349475 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:49.349452 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:49.471542 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:49.471515 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzhr8\" (UniqueName: \"kubernetes.io/projected/82cbca28-0058-49ab-b79d-bedd3ee673f4-kube-api-access-hzhr8\") pod \"82cbca28-0058-49ab-b79d-bedd3ee673f4\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " Mar 18 16:49:49.471687 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:49.471605 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-bundle\") pod \"82cbca28-0058-49ab-b79d-bedd3ee673f4\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " Mar 18 16:49:49.471687 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:49.471628 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-util\") pod \"82cbca28-0058-49ab-b79d-bedd3ee673f4\" (UID: \"82cbca28-0058-49ab-b79d-bedd3ee673f4\") " Mar 18 16:49:49.472211 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:49.472181 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-bundle" (OuterVolumeSpecName: "bundle") pod "82cbca28-0058-49ab-b79d-bedd3ee673f4" (UID: "82cbca28-0058-49ab-b79d-bedd3ee673f4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:49:49.473691 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:49.473662 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82cbca28-0058-49ab-b79d-bedd3ee673f4-kube-api-access-hzhr8" (OuterVolumeSpecName: "kube-api-access-hzhr8") pod "82cbca28-0058-49ab-b79d-bedd3ee673f4" (UID: "82cbca28-0058-49ab-b79d-bedd3ee673f4"). InnerVolumeSpecName "kube-api-access-hzhr8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:49:49.476836 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:49.476811 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-util" (OuterVolumeSpecName: "util") pod "82cbca28-0058-49ab-b79d-bedd3ee673f4" (UID: "82cbca28-0058-49ab-b79d-bedd3ee673f4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:49:49.572643 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:49.572563 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-bundle\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:49:49.572643 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:49.572593 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82cbca28-0058-49ab-b79d-bedd3ee673f4-util\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:49:49.572643 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:49.572606 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hzhr8\" (UniqueName: \"kubernetes.io/projected/82cbca28-0058-49ab-b79d-bedd3ee673f4-kube-api-access-hzhr8\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:49:50.233784 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:50.233738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" event={"ID":"82cbca28-0058-49ab-b79d-bedd3ee673f4","Type":"ContainerDied","Data":"5af81338f3817fe1d3d9d4f717592c8091e67dbea81481a1ba95e2d5dec974ef"} Mar 18 16:49:50.233784 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:50.233782 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5af81338f3817fe1d3d9d4f717592c8091e67dbea81481a1ba95e2d5dec974ef" Mar 18 16:49:50.234065 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:50.233801 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cj8g6f" Mar 18 16:49:53.344297 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.344268 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp"] Mar 18 16:49:53.344764 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.344738 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82cbca28-0058-49ab-b79d-bedd3ee673f4" containerName="pull" Mar 18 16:49:53.344764 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.344755 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="82cbca28-0058-49ab-b79d-bedd3ee673f4" containerName="pull" Mar 18 16:49:53.344887 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.344779 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82cbca28-0058-49ab-b79d-bedd3ee673f4" containerName="util" Mar 18 16:49:53.344887 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.344787 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="82cbca28-0058-49ab-b79d-bedd3ee673f4" containerName="util" Mar 18 16:49:53.344887 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.344803 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82cbca28-0058-49ab-b79d-bedd3ee673f4" containerName="extract" Mar 18 16:49:53.344887 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.344810 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="82cbca28-0058-49ab-b79d-bedd3ee673f4" containerName="extract" Mar 18 16:49:53.344887 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.344885 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="82cbca28-0058-49ab-b79d-bedd3ee673f4" containerName="extract" Mar 18 16:49:53.355000 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.354980 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" Mar 18 16:49:53.357220 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.357199 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Mar 18 16:49:53.357378 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.357254 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Mar 18 16:49:53.357463 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.357304 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Mar 18 16:49:53.357523 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.357319 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-5gvzz\"" Mar 18 16:49:53.358351 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.358331 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp"] Mar 18 16:49:53.501466 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.501434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmhz6\" (UniqueName: \"kubernetes.io/projected/9bab0cb4-3466-4b78-a173-8d2b37b85164-kube-api-access-gmhz6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp\" (UID: \"9bab0cb4-3466-4b78-a173-8d2b37b85164\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" Mar 18 16:49:53.501466 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.501470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9bab0cb4-3466-4b78-a173-8d2b37b85164-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp\" (UID: \"9bab0cb4-3466-4b78-a173-8d2b37b85164\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" Mar 18 16:49:53.602582 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.602506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmhz6\" (UniqueName: \"kubernetes.io/projected/9bab0cb4-3466-4b78-a173-8d2b37b85164-kube-api-access-gmhz6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp\" (UID: \"9bab0cb4-3466-4b78-a173-8d2b37b85164\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" Mar 18 16:49:53.602582 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.602542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9bab0cb4-3466-4b78-a173-8d2b37b85164-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp\" (UID: \"9bab0cb4-3466-4b78-a173-8d2b37b85164\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" Mar 18 16:49:53.605025 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.605003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9bab0cb4-3466-4b78-a173-8d2b37b85164-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp\" (UID: \"9bab0cb4-3466-4b78-a173-8d2b37b85164\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" Mar 18 16:49:53.615892 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.615867 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmhz6\" (UniqueName: \"kubernetes.io/projected/9bab0cb4-3466-4b78-a173-8d2b37b85164-kube-api-access-gmhz6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp\" (UID: \"9bab0cb4-3466-4b78-a173-8d2b37b85164\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" Mar 18 16:49:53.665804 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.665776 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" Mar 18 16:49:53.792827 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.792799 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp"] Mar 18 16:49:53.795308 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:49:53.795281 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bab0cb4_3466_4b78_a173_8d2b37b85164.slice/crio-186dbd9cee499ba42cf9b483cba08ded93a4c4c8ee7d0540ef01e52ee157b543 WatchSource:0}: Error finding container 186dbd9cee499ba42cf9b483cba08ded93a4c4c8ee7d0540ef01e52ee157b543: Status 404 returned error can't find the container with id 186dbd9cee499ba42cf9b483cba08ded93a4c4c8ee7d0540ef01e52ee157b543 Mar 18 16:49:53.797113 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:53.797089 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:49:54.246089 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:54.246055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" event={"ID":"9bab0cb4-3466-4b78-a173-8d2b37b85164","Type":"ContainerStarted","Data":"186dbd9cee499ba42cf9b483cba08ded93a4c4c8ee7d0540ef01e52ee157b543"} Mar 18 16:49:58.070438 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.070406 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9fbq8"] Mar 18 16:49:58.090047 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.090020 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9fbq8"] Mar 18 16:49:58.090199 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.090132 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:58.092542 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.092516 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Mar 18 16:49:58.092674 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.092521 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-klwpk\"" Mar 18 16:49:58.092674 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.092610 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Mar 18 16:49:58.239803 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.239775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltd2x\" (UniqueName: \"kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-kube-api-access-ltd2x\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:58.239953 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.239818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c138b489-420a-46b9-b21f-57575128e8ab-cabundle0\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:58.239953 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.239845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:58.260699 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.260662 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" event={"ID":"9bab0cb4-3466-4b78-a173-8d2b37b85164","Type":"ContainerStarted","Data":"310e58ee0c5a4d40e511dedbad6684f5c418b3b5d2d7dd381e2e8764f7357041"} Mar 18 16:49:58.260854 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.260822 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" Mar 18 16:49:58.274857 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.274829 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm"] Mar 18 16:49:58.279258 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.279213 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" podStartSLOduration=1.606820667 podStartE2EDuration="5.279198919s" podCreationTimestamp="2026-03-18 16:49:53 +0000 UTC" firstStartedPulling="2026-03-18 16:49:53.797227257 +0000 UTC m=+312.108480157" lastFinishedPulling="2026-03-18 16:49:57.469605496 +0000 UTC m=+315.780858409" observedRunningTime="2026-03-18 16:49:58.278570678 +0000 UTC m=+316.589823598" watchObservedRunningTime="2026-03-18 16:49:58.279198919 +0000 UTC m=+316.590451841" Mar 18 16:49:58.305148 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.305126 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:49:58.307079 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.307056 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Mar 18 16:49:58.308887 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.308869 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm"] Mar 18 16:49:58.340187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.340117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c138b489-420a-46b9-b21f-57575128e8ab-cabundle0\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:58.340187 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.340156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:58.340343 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.340276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltd2x\" (UniqueName: \"kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-kube-api-access-ltd2x\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:58.340343 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.340333 2575 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:49:58.340485 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.340351 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:49:58.340485 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.340379 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9fbq8: references non-existent secret key: ca.crt Mar 18 16:49:58.340485 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.340441 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates podName:c138b489-420a-46b9-b21f-57575128e8ab nodeName:}" failed. No retries permitted until 2026-03-18 16:49:58.840421131 +0000 UTC m=+317.151674046 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates") pod "keda-operator-ffbb595cb-9fbq8" (UID: "c138b489-420a-46b9-b21f-57575128e8ab") : references non-existent secret key: ca.crt Mar 18 16:49:58.340791 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.340771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c138b489-420a-46b9-b21f-57575128e8ab-cabundle0\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:58.348317 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.348296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltd2x\" (UniqueName: \"kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-kube-api-access-ltd2x\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:58.441630 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.441602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:49:58.441798 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.441638 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7lj2\" (UniqueName: \"kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-kube-api-access-s7lj2\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:49:58.441798 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.441662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:49:58.542575 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.542539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:49:58.542753 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.542583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7lj2\" (UniqueName: \"kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-kube-api-access-s7lj2\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:49:58.542753 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.542617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: 
\"kubernetes.io/empty-dir/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:49:58.542753 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.542687 2575 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:49:58.542753 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.542708 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:49:58.542753 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.542728 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm: references non-existent secret key: tls.crt Mar 18 16:49:58.543016 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.542798 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates podName:5c7a0f12-ae8d-43a9-977d-0cff8ba6283c nodeName:}" failed. No retries permitted until 2026-03-18 16:49:59.042778136 +0000 UTC m=+317.354031042 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates") pod "keda-metrics-apiserver-7c9f485588-nh4mm" (UID: "5c7a0f12-ae8d-43a9-977d-0cff8ba6283c") : references non-existent secret key: tls.crt Mar 18 16:49:58.543073 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.543017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:49:58.557022 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.556986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7lj2\" (UniqueName: \"kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-kube-api-access-s7lj2\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:49:58.557865 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.557840 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-gzktr"] Mar 18 16:49:58.582171 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.582144 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-gzktr"] Mar 18 16:49:58.582326 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.582278 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-gzktr" Mar 18 16:49:58.584582 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.584558 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Mar 18 16:49:58.744828 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.744791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d43b6345-9d33-426d-991c-4a6d25c9d67b-certificates\") pod \"keda-admission-cf49989db-gzktr\" (UID: \"d43b6345-9d33-426d-991c-4a6d25c9d67b\") " pod="openshift-keda/keda-admission-cf49989db-gzktr" Mar 18 16:49:58.744981 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.744844 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8cvq\" (UniqueName: \"kubernetes.io/projected/d43b6345-9d33-426d-991c-4a6d25c9d67b-kube-api-access-w8cvq\") pod \"keda-admission-cf49989db-gzktr\" (UID: \"d43b6345-9d33-426d-991c-4a6d25c9d67b\") " pod="openshift-keda/keda-admission-cf49989db-gzktr" Mar 18 16:49:58.846170 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.846065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8cvq\" (UniqueName: \"kubernetes.io/projected/d43b6345-9d33-426d-991c-4a6d25c9d67b-kube-api-access-w8cvq\") pod \"keda-admission-cf49989db-gzktr\" (UID: \"d43b6345-9d33-426d-991c-4a6d25c9d67b\") " pod="openshift-keda/keda-admission-cf49989db-gzktr" Mar 18 16:49:58.846170 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.846157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:58.846413 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.846251 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d43b6345-9d33-426d-991c-4a6d25c9d67b-certificates\") pod \"keda-admission-cf49989db-gzktr\" (UID: \"d43b6345-9d33-426d-991c-4a6d25c9d67b\") " pod="openshift-keda/keda-admission-cf49989db-gzktr" Mar 18 16:49:58.846899 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.846870 2575 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:49:58.846899 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.846900 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:49:58.847041 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.846911 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9fbq8: references non-existent secret key: ca.crt Mar 18 16:49:58.847041 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:58.846971 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates podName:c138b489-420a-46b9-b21f-57575128e8ab nodeName:}" failed. No retries permitted until 2026-03-18 16:49:59.846952785 +0000 UTC m=+318.158205684 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates") pod "keda-operator-ffbb595cb-9fbq8" (UID: "c138b489-420a-46b9-b21f-57575128e8ab") : references non-existent secret key: ca.crt Mar 18 16:49:58.849920 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.849895 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d43b6345-9d33-426d-991c-4a6d25c9d67b-certificates\") pod \"keda-admission-cf49989db-gzktr\" (UID: \"d43b6345-9d33-426d-991c-4a6d25c9d67b\") " pod="openshift-keda/keda-admission-cf49989db-gzktr" Mar 18 16:49:58.868638 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.868613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8cvq\" (UniqueName: \"kubernetes.io/projected/d43b6345-9d33-426d-991c-4a6d25c9d67b-kube-api-access-w8cvq\") pod \"keda-admission-cf49989db-gzktr\" (UID: \"d43b6345-9d33-426d-991c-4a6d25c9d67b\") " pod="openshift-keda/keda-admission-cf49989db-gzktr" Mar 18 16:49:58.895532 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:58.895506 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-gzktr" Mar 18 16:49:59.023706 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:59.023672 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-gzktr"] Mar 18 16:49:59.030314 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:49:59.030282 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd43b6345_9d33_426d_991c_4a6d25c9d67b.slice/crio-47982169be2b79a3a5ca2fc5abe500f0c5986e4db8707c0c4cd6e712010af440 WatchSource:0}: Error finding container 47982169be2b79a3a5ca2fc5abe500f0c5986e4db8707c0c4cd6e712010af440: Status 404 returned error can't find the container with id 47982169be2b79a3a5ca2fc5abe500f0c5986e4db8707c0c4cd6e712010af440 Mar 18 16:49:59.047212 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:59.047188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:49:59.047302 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:59.047294 2575 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:49:59.047341 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:59.047304 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:49:59.047341 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:59.047319 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm: references non-existent secret key: tls.crt Mar 18 16:49:59.047434 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:59.047413 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates podName:5c7a0f12-ae8d-43a9-977d-0cff8ba6283c nodeName:}" failed. No retries permitted until 2026-03-18 16:50:00.047349226 +0000 UTC m=+318.358602125 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates") pod "keda-metrics-apiserver-7c9f485588-nh4mm" (UID: "5c7a0f12-ae8d-43a9-977d-0cff8ba6283c") : references non-existent secret key: tls.crt Mar 18 16:49:59.265053 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:59.265014 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-gzktr" event={"ID":"d43b6345-9d33-426d-991c-4a6d25c9d67b","Type":"ContainerStarted","Data":"47982169be2b79a3a5ca2fc5abe500f0c5986e4db8707c0c4cd6e712010af440"} Mar 18 16:49:59.852808 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:49:59.852773 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:49:59.853012 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:59.852917 2575 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:49:59.853012 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:59.852935 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:49:59.853012 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:59.852949 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9fbq8: references non-existent secret key: ca.crt Mar 18 16:49:59.853179 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:49:59.853015 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates podName:c138b489-420a-46b9-b21f-57575128e8ab nodeName:}" failed. No retries permitted until 2026-03-18 16:50:01.852995304 +0000 UTC m=+320.164248203 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates") pod "keda-operator-ffbb595cb-9fbq8" (UID: "c138b489-420a-46b9-b21f-57575128e8ab") : references non-existent secret key: ca.crt Mar 18 16:50:00.054248 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:00.054208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:50:00.054447 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:00.054376 2575 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:50:00.054447 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:00.054397 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:50:00.054447 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:00.054420 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm: references non-existent secret key: tls.crt Mar 18 16:50:00.054558 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:00.054480 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates podName:5c7a0f12-ae8d-43a9-977d-0cff8ba6283c nodeName:}" failed. No retries permitted until 2026-03-18 16:50:02.054462404 +0000 UTC m=+320.365715314 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates") pod "keda-metrics-apiserver-7c9f485588-nh4mm" (UID: "5c7a0f12-ae8d-43a9-977d-0cff8ba6283c") : references non-existent secret key: tls.crt Mar 18 16:50:01.274148 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:01.274114 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-gzktr" event={"ID":"d43b6345-9d33-426d-991c-4a6d25c9d67b","Type":"ContainerStarted","Data":"3970d484e71ded32fac28e5a2dfffd6b2697bc8fc9fea570fdc472a02386fe90"} Mar 18 16:50:01.274628 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:01.274218 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-gzktr" Mar 18 16:50:01.292493 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:01.292440 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-gzktr" podStartSLOduration=1.556187414 podStartE2EDuration="3.29242436s" podCreationTimestamp="2026-03-18 16:49:58 +0000 UTC" firstStartedPulling="2026-03-18 16:49:59.031700035 +0000 UTC m=+317.342952934" lastFinishedPulling="2026-03-18 16:50:00.767936977 +0000 UTC m=+319.079189880" observedRunningTime="2026-03-18 16:50:01.290744903 +0000 UTC m=+319.601997826" watchObservedRunningTime="2026-03-18 16:50:01.29242436 +0000 UTC m=+319.603677284" Mar 18 16:50:01.868575 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:01.868542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: 
\"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:50:01.868749 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:01.868700 2575 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:50:01.868749 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:01.868719 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:50:01.868749 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:01.868728 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9fbq8: references non-existent secret key: ca.crt Mar 18 16:50:01.868849 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:01.868783 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates podName:c138b489-420a-46b9-b21f-57575128e8ab nodeName:}" failed. No retries permitted until 2026-03-18 16:50:05.868768043 +0000 UTC m=+324.180020946 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates") pod "keda-operator-ffbb595cb-9fbq8" (UID: "c138b489-420a-46b9-b21f-57575128e8ab") : references non-existent secret key: ca.crt Mar 18 16:50:02.069703 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:02.069667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:50:02.069873 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:02.069784 2575 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:50:02.069873 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:02.069795 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:50:02.069873 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:02.069812 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm: references non-existent secret key: tls.crt Mar 18 16:50:02.069873 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:50:02.069860 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates podName:5c7a0f12-ae8d-43a9-977d-0cff8ba6283c nodeName:}" failed. No retries permitted until 2026-03-18 16:50:06.069847873 +0000 UTC m=+324.381100772 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates") pod "keda-metrics-apiserver-7c9f485588-nh4mm" (UID: "5c7a0f12-ae8d-43a9-977d-0cff8ba6283c") : references non-existent secret key: tls.crt
Mar 18 16:50:05.896624 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:05.896588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8"
Mar 18 16:50:05.899162 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:05.899135 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c138b489-420a-46b9-b21f-57575128e8ab-certificates\") pod \"keda-operator-ffbb595cb-9fbq8\" (UID: \"c138b489-420a-46b9-b21f-57575128e8ab\") " pod="openshift-keda/keda-operator-ffbb595cb-9fbq8"
Mar 18 16:50:05.899897 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:05.899881 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-9fbq8"
Mar 18 16:50:06.019303 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:06.019272 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9fbq8"]
Mar 18 16:50:06.022671 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:50:06.022640 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc138b489_420a_46b9_b21f_57575128e8ab.slice/crio-e0fa9e3784de618e955c5e63a6987cf734082cd65dc41606de34581f555913d8 WatchSource:0}: Error finding container e0fa9e3784de618e955c5e63a6987cf734082cd65dc41606de34581f555913d8: Status 404 returned error can't find the container with id e0fa9e3784de618e955c5e63a6987cf734082cd65dc41606de34581f555913d8
Mar 18 16:50:06.098525 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:06.098493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm"
Mar 18 16:50:06.101111 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:06.101090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5c7a0f12-ae8d-43a9-977d-0cff8ba6283c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-nh4mm\" (UID: \"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm"
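
Everything above this point is the same failure repeating: the projected "certificates" volume draws its contents from the openshift-keda/kedaorg-certs secret, and the keys being projected (ca.crt for keda-operator, tls.crt for keda-metrics-apiserver) did not exist in that secret yet. Each failed MountVolume.SetUp doubles the wait before the next attempt (durationBeforeRetry 2s, then 4s above), and once whatever controller owns kedaorg-certs has written the keys, both mounts succeed at 16:50:05-16:50:06. Below is a minimal Go sketch of that doubling-delay pattern as it appears in this log; it is an illustration, not the kubelet's actual backoff code, and the two-minute cap is an assumption.

    package main

    import (
        "fmt"
        "time"
    )

    // nextRetryDelay reproduces the progression visible in the log's
    // durationBeforeRetry fields (500ms, 1s, 2s, 4s, ...): double the
    // previous delay after every consecutive failure, up to a cap.
    func nextRetryDelay(prev, limit time.Duration) time.Duration {
        if prev == 0 {
            return 500 * time.Millisecond // smallest delay seen in this log
        }
        next := 2 * prev
        if next > limit {
            return limit // cap value is assumed for illustration
        }
        return next
    }

    func main() {
        var delay time.Duration
        for attempt := 1; attempt <= 5; attempt++ {
            delay = nextRetryDelay(delay, 2*time.Minute)
            fmt.Printf("attempt %d failed, retry in %v\n", attempt, delay)
        }
    }

Note that the retries are per-volume: the keda-operator and keda-metrics-apiserver volumes fail and back off independently, which is why their retry deadlines above are offset from each other.
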
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:50:06.233188 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:06.233012 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm"] Mar 18 16:50:06.235817 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:50:06.235788 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c7a0f12_ae8d_43a9_977d_0cff8ba6283c.slice/crio-5262d87112a8e9b40be7e09f44acabb66689dfeb8919a0b542c1daf42bc777ef WatchSource:0}: Error finding container 5262d87112a8e9b40be7e09f44acabb66689dfeb8919a0b542c1daf42bc777ef: Status 404 returned error can't find the container with id 5262d87112a8e9b40be7e09f44acabb66689dfeb8919a0b542c1daf42bc777ef Mar 18 16:50:06.290072 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:06.290042 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" event={"ID":"c138b489-420a-46b9-b21f-57575128e8ab","Type":"ContainerStarted","Data":"e0fa9e3784de618e955c5e63a6987cf734082cd65dc41606de34581f555913d8"} Mar 18 16:50:06.290329 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:06.290309 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" event={"ID":"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c","Type":"ContainerStarted","Data":"5262d87112a8e9b40be7e09f44acabb66689dfeb8919a0b542c1daf42bc777ef"} Mar 18 16:50:09.301123 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:09.301081 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" event={"ID":"5c7a0f12-ae8d-43a9-977d-0cff8ba6283c","Type":"ContainerStarted","Data":"3034d5b5419eca58fb5df693d56d568fd1a92864af3a5696ac26dd822cc947f9"} Mar 18 16:50:09.301488 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:09.301221 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:50:09.317301 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:09.317260 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" podStartSLOduration=8.463264287 podStartE2EDuration="11.317247551s" podCreationTimestamp="2026-03-18 16:49:58 +0000 UTC" firstStartedPulling="2026-03-18 16:50:06.237208234 +0000 UTC m=+324.548461133" lastFinishedPulling="2026-03-18 16:50:09.091191482 +0000 UTC m=+327.402444397" observedRunningTime="2026-03-18 16:50:09.315202771 +0000 UTC m=+327.626455691" watchObservedRunningTime="2026-03-18 16:50:09.317247551 +0000 UTC m=+327.628500449" Mar 18 16:50:15.320968 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:15.320933 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" event={"ID":"c138b489-420a-46b9-b21f-57575128e8ab","Type":"ContainerStarted","Data":"c75a2930e345d36a975e1113c0e0f7e039cf68029230acb4489ee00ea0a7cf64"} Mar 18 16:50:15.321340 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:15.321054 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:50:15.337982 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:15.337945 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" podStartSLOduration=8.913559291 
podStartE2EDuration="17.337931404s" podCreationTimestamp="2026-03-18 16:49:58 +0000 UTC" firstStartedPulling="2026-03-18 16:50:06.024243249 +0000 UTC m=+324.335496148" lastFinishedPulling="2026-03-18 16:50:14.448615362 +0000 UTC m=+332.759868261" observedRunningTime="2026-03-18 16:50:15.336643927 +0000 UTC m=+333.647896848" watchObservedRunningTime="2026-03-18 16:50:15.337931404 +0000 UTC m=+333.649184328" Mar 18 16:50:19.267417 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:19.267375 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-hvpxp" Mar 18 16:50:20.308228 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:20.308154 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-nh4mm" Mar 18 16:50:22.278757 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:22.278730 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-gzktr" Mar 18 16:50:36.325469 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:50:36.325439 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-9fbq8" Mar 18 16:51:03.914186 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:03.914149 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-8qnj9"] Mar 18 16:51:03.917526 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:03.917503 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:03.919778 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:03.919750 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:51:03.919778 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:03.919765 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Mar 18 16:51:03.920273 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:03.920256 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-q7g85\"" Mar 18 16:51:03.920375 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:03.920279 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:51:03.926733 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:03.926715 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-8qnj9"] Mar 18 16:51:04.021855 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:04.021824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xc7q\" (UniqueName: \"kubernetes.io/projected/281a34d7-fda4-47fc-880c-f533b7f4138b-kube-api-access-6xc7q\") pod \"kserve-controller-manager-69d7c9bbdc-8qnj9\" (UID: \"281a34d7-fda4-47fc-880c-f533b7f4138b\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:04.021982 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:04.021869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/281a34d7-fda4-47fc-880c-f533b7f4138b-cert\") pod \"kserve-controller-manager-69d7c9bbdc-8qnj9\" (UID: \"281a34d7-fda4-47fc-880c-f533b7f4138b\") " 
pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:04.122381 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:04.122334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xc7q\" (UniqueName: \"kubernetes.io/projected/281a34d7-fda4-47fc-880c-f533b7f4138b-kube-api-access-6xc7q\") pod \"kserve-controller-manager-69d7c9bbdc-8qnj9\" (UID: \"281a34d7-fda4-47fc-880c-f533b7f4138b\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:04.122528 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:04.122410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/281a34d7-fda4-47fc-880c-f533b7f4138b-cert\") pod \"kserve-controller-manager-69d7c9bbdc-8qnj9\" (UID: \"281a34d7-fda4-47fc-880c-f533b7f4138b\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:04.122528 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:04.122515 2575 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Mar 18 16:51:04.122603 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:04.122558 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/281a34d7-fda4-47fc-880c-f533b7f4138b-cert podName:281a34d7-fda4-47fc-880c-f533b7f4138b nodeName:}" failed. No retries permitted until 2026-03-18 16:51:04.622544035 +0000 UTC m=+382.933796934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/281a34d7-fda4-47fc-880c-f533b7f4138b-cert") pod "kserve-controller-manager-69d7c9bbdc-8qnj9" (UID: "281a34d7-fda4-47fc-880c-f533b7f4138b") : secret "kserve-webhook-server-cert" not found Mar 18 16:51:04.131374 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:04.131345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xc7q\" (UniqueName: \"kubernetes.io/projected/281a34d7-fda4-47fc-880c-f533b7f4138b-kube-api-access-6xc7q\") pod \"kserve-controller-manager-69d7c9bbdc-8qnj9\" (UID: \"281a34d7-fda4-47fc-880c-f533b7f4138b\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:04.626673 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:04.626637 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/281a34d7-fda4-47fc-880c-f533b7f4138b-cert\") pod \"kserve-controller-manager-69d7c9bbdc-8qnj9\" (UID: \"281a34d7-fda4-47fc-880c-f533b7f4138b\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:04.629236 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:04.629212 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/281a34d7-fda4-47fc-880c-f533b7f4138b-cert\") pod \"kserve-controller-manager-69d7c9bbdc-8qnj9\" (UID: \"281a34d7-fda4-47fc-880c-f533b7f4138b\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:04.828204 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:04.828147 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:04.946887 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:04.946861 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-8qnj9"] Mar 18 16:51:04.949420 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:51:04.949390 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod281a34d7_fda4_47fc_880c_f533b7f4138b.slice/crio-93dbdefc552e44051beb9eca53e76decdc8904bc17244a2c82a56c17204c3c9e WatchSource:0}: Error finding container 93dbdefc552e44051beb9eca53e76decdc8904bc17244a2c82a56c17204c3c9e: Status 404 returned error can't find the container with id 93dbdefc552e44051beb9eca53e76decdc8904bc17244a2c82a56c17204c3c9e Mar 18 16:51:05.338745 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:05.338661 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-8qnj9"] Mar 18 16:51:05.470293 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:05.470257 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" event={"ID":"281a34d7-fda4-47fc-880c-f533b7f4138b","Type":"ContainerStarted","Data":"93dbdefc552e44051beb9eca53e76decdc8904bc17244a2c82a56c17204c3c9e"} Mar 18 16:51:08.482605 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.482563 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" event={"ID":"281a34d7-fda4-47fc-880c-f533b7f4138b","Type":"ContainerStarted","Data":"aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c"} Mar 18 16:51:08.483017 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.482612 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" podUID="281a34d7-fda4-47fc-880c-f533b7f4138b" containerName="manager" containerID="cri-o://aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c" gracePeriod=10 Mar 18 16:51:08.483017 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.482725 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:08.497340 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.497287 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" podStartSLOduration=2.87469602 podStartE2EDuration="5.497274628s" podCreationTimestamp="2026-03-18 16:51:03 +0000 UTC" firstStartedPulling="2026-03-18 16:51:04.951224159 +0000 UTC m=+383.262477058" lastFinishedPulling="2026-03-18 16:51:07.573802764 +0000 UTC m=+385.885055666" observedRunningTime="2026-03-18 16:51:08.496789385 +0000 UTC m=+386.808042303" watchObservedRunningTime="2026-03-18 16:51:08.497274628 +0000 UTC m=+386.808527550" Mar 18 16:51:08.720717 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.720696 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:08.760485 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.760421 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xc7q\" (UniqueName: \"kubernetes.io/projected/281a34d7-fda4-47fc-880c-f533b7f4138b-kube-api-access-6xc7q\") pod \"281a34d7-fda4-47fc-880c-f533b7f4138b\" (UID: \"281a34d7-fda4-47fc-880c-f533b7f4138b\") " Mar 18 16:51:08.760485 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.760457 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/281a34d7-fda4-47fc-880c-f533b7f4138b-cert\") pod \"281a34d7-fda4-47fc-880c-f533b7f4138b\" (UID: \"281a34d7-fda4-47fc-880c-f533b7f4138b\") " Mar 18 16:51:08.762740 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.762718 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281a34d7-fda4-47fc-880c-f533b7f4138b-cert" (OuterVolumeSpecName: "cert") pod "281a34d7-fda4-47fc-880c-f533b7f4138b" (UID: "281a34d7-fda4-47fc-880c-f533b7f4138b"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:51:08.762845 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.762815 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/281a34d7-fda4-47fc-880c-f533b7f4138b-kube-api-access-6xc7q" (OuterVolumeSpecName: "kube-api-access-6xc7q") pod "281a34d7-fda4-47fc-880c-f533b7f4138b" (UID: "281a34d7-fda4-47fc-880c-f533b7f4138b"). InnerVolumeSpecName "kube-api-access-6xc7q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:51:08.861461 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.861432 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6xc7q\" (UniqueName: \"kubernetes.io/projected/281a34d7-fda4-47fc-880c-f533b7f4138b-kube-api-access-6xc7q\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:51:08.861461 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:08.861459 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/281a34d7-fda4-47fc-880c-f533b7f4138b-cert\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:51:09.486375 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:09.486337 2575 generic.go:358] "Generic (PLEG): container finished" podID="281a34d7-fda4-47fc-880c-f533b7f4138b" containerID="aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c" exitCode=0 Mar 18 16:51:09.486753 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:09.486399 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" event={"ID":"281a34d7-fda4-47fc-880c-f533b7f4138b","Type":"ContainerDied","Data":"aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c"} Mar 18 16:51:09.486753 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:09.486418 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" Mar 18 16:51:09.486753 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:09.486427 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-8qnj9" event={"ID":"281a34d7-fda4-47fc-880c-f533b7f4138b","Type":"ContainerDied","Data":"93dbdefc552e44051beb9eca53e76decdc8904bc17244a2c82a56c17204c3c9e"} Mar 18 16:51:09.486753 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:09.486443 2575 scope.go:117] "RemoveContainer" containerID="aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c" Mar 18 16:51:09.494978 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:09.494960 2575 scope.go:117] "RemoveContainer" containerID="aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c" Mar 18 16:51:09.495233 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:09.495216 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c\": container with ID starting with aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c not found: ID does not exist" containerID="aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c" Mar 18 16:51:09.495282 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:09.495243 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c"} err="failed to get container status \"aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c\": rpc error: code = NotFound desc = could not find container \"aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c\": container with ID starting with aca758d28c703680543af67aab1d2fe86da222df6d531671d5c8dfdcc0735f6c not found: ID does not exist" Mar 18 16:51:09.505683 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:09.505659 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-8qnj9"] Mar 18 16:51:09.509000 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:09.508981 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-8qnj9"] Mar 18 16:51:10.290189 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:10.290149 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="281a34d7-fda4-47fc-880c-f533b7f4138b" path="/var/lib/kubelet/pods/281a34d7-fda4-47fc-880c-f533b7f4138b/volumes" Mar 18 16:51:41.871670 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.871640 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-9699c8d45-ttb7x"] Mar 18 16:51:41.872139 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.871933 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="281a34d7-fda4-47fc-880c-f533b7f4138b" containerName="manager" Mar 18 16:51:41.872139 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.871944 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="281a34d7-fda4-47fc-880c-f533b7f4138b" containerName="manager" Mar 18 16:51:41.872139 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.872000 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="281a34d7-fda4-47fc-880c-f533b7f4138b" containerName="manager" Mar 18 16:51:41.877726 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.877708 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-9699c8d45-ttb7x" Mar 18 16:51:41.879840 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.879818 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-cttxw\"" Mar 18 16:51:41.879982 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.879850 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:51:41.879982 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.879866 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:51:41.879982 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.879850 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Mar 18 16:51:41.885521 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.885501 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-9699c8d45-ttb7x"] Mar 18 16:51:41.892162 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.892137 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-twgv8"] Mar 18 16:51:41.894927 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.894907 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-twgv8" Mar 18 16:51:41.897092 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.897071 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Mar 18 16:51:41.897208 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.897108 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-bmrpb\"" Mar 18 16:51:41.903867 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:41.903839 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-twgv8"] Mar 18 16:51:42.002850 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.002821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71bd2296-9d68-4ee7-96fc-7080d3255d69-cert\") pod \"odh-model-controller-696fc77849-twgv8\" (UID: \"71bd2296-9d68-4ee7-96fc-7080d3255d69\") " pod="kserve/odh-model-controller-696fc77849-twgv8" Mar 18 16:51:42.003030 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.002881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th4sv\" (UniqueName: \"kubernetes.io/projected/71bd2296-9d68-4ee7-96fc-7080d3255d69-kube-api-access-th4sv\") pod \"odh-model-controller-696fc77849-twgv8\" (UID: \"71bd2296-9d68-4ee7-96fc-7080d3255d69\") " pod="kserve/odh-model-controller-696fc77849-twgv8" Mar 18 16:51:42.003030 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.002927 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e41abd12-239d-46fb-9cdb-9d35fa51024d-tls-certs\") pod \"model-serving-api-9699c8d45-ttb7x\" (UID: \"e41abd12-239d-46fb-9cdb-9d35fa51024d\") " pod="kserve/model-serving-api-9699c8d45-ttb7x" Mar 18 16:51:42.003030 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.002972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vhs26\" (UniqueName: \"kubernetes.io/projected/e41abd12-239d-46fb-9cdb-9d35fa51024d-kube-api-access-vhs26\") pod \"model-serving-api-9699c8d45-ttb7x\" (UID: \"e41abd12-239d-46fb-9cdb-9d35fa51024d\") " pod="kserve/model-serving-api-9699c8d45-ttb7x" Mar 18 16:51:42.103561 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.103527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71bd2296-9d68-4ee7-96fc-7080d3255d69-cert\") pod \"odh-model-controller-696fc77849-twgv8\" (UID: \"71bd2296-9d68-4ee7-96fc-7080d3255d69\") " pod="kserve/odh-model-controller-696fc77849-twgv8" Mar 18 16:51:42.103755 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.103642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th4sv\" (UniqueName: \"kubernetes.io/projected/71bd2296-9d68-4ee7-96fc-7080d3255d69-kube-api-access-th4sv\") pod \"odh-model-controller-696fc77849-twgv8\" (UID: \"71bd2296-9d68-4ee7-96fc-7080d3255d69\") " pod="kserve/odh-model-controller-696fc77849-twgv8" Mar 18 16:51:42.103755 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.103673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e41abd12-239d-46fb-9cdb-9d35fa51024d-tls-certs\") pod \"model-serving-api-9699c8d45-ttb7x\" (UID: \"e41abd12-239d-46fb-9cdb-9d35fa51024d\") " pod="kserve/model-serving-api-9699c8d45-ttb7x" Mar 18 16:51:42.103755 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.103715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhs26\" (UniqueName: \"kubernetes.io/projected/e41abd12-239d-46fb-9cdb-9d35fa51024d-kube-api-access-vhs26\") pod \"model-serving-api-9699c8d45-ttb7x\" (UID: \"e41abd12-239d-46fb-9cdb-9d35fa51024d\") " pod="kserve/model-serving-api-9699c8d45-ttb7x" Mar 18 16:51:42.103927 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:42.103811 2575 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Mar 18 16:51:42.103927 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:42.103887 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e41abd12-239d-46fb-9cdb-9d35fa51024d-tls-certs podName:e41abd12-239d-46fb-9cdb-9d35fa51024d nodeName:}" failed. No retries permitted until 2026-03-18 16:51:42.603867215 +0000 UTC m=+420.915120116 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/e41abd12-239d-46fb-9cdb-9d35fa51024d-tls-certs") pod "model-serving-api-9699c8d45-ttb7x" (UID: "e41abd12-239d-46fb-9cdb-9d35fa51024d") : secret "model-serving-api-tls" not found Mar 18 16:51:42.106161 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.106125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/71bd2296-9d68-4ee7-96fc-7080d3255d69-cert\") pod \"odh-model-controller-696fc77849-twgv8\" (UID: \"71bd2296-9d68-4ee7-96fc-7080d3255d69\") " pod="kserve/odh-model-controller-696fc77849-twgv8" Mar 18 16:51:42.114546 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.114522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhs26\" (UniqueName: \"kubernetes.io/projected/e41abd12-239d-46fb-9cdb-9d35fa51024d-kube-api-access-vhs26\") pod \"model-serving-api-9699c8d45-ttb7x\" (UID: \"e41abd12-239d-46fb-9cdb-9d35fa51024d\") " pod="kserve/model-serving-api-9699c8d45-ttb7x" Mar 18 16:51:42.114650 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.114636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th4sv\" (UniqueName: \"kubernetes.io/projected/71bd2296-9d68-4ee7-96fc-7080d3255d69-kube-api-access-th4sv\") pod \"odh-model-controller-696fc77849-twgv8\" (UID: \"71bd2296-9d68-4ee7-96fc-7080d3255d69\") " pod="kserve/odh-model-controller-696fc77849-twgv8" Mar 18 16:51:42.208031 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.208012 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-bmrpb\"" Mar 18 16:51:42.216901 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.216881 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-twgv8" Mar 18 16:51:42.341473 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.341451 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-twgv8"] Mar 18 16:51:42.343782 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:51:42.343751 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bd2296_9d68_4ee7_96fc_7080d3255d69.slice/crio-625288d04493613dbff9f39efce52f1e51434a1f8d1c0a5c3dc514ed7f26bf7a WatchSource:0}: Error finding container 625288d04493613dbff9f39efce52f1e51434a1f8d1c0a5c3dc514ed7f26bf7a: Status 404 returned error can't find the container with id 625288d04493613dbff9f39efce52f1e51434a1f8d1c0a5c3dc514ed7f26bf7a Mar 18 16:51:42.595246 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.595154 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-twgv8" event={"ID":"71bd2296-9d68-4ee7-96fc-7080d3255d69","Type":"ContainerStarted","Data":"625288d04493613dbff9f39efce52f1e51434a1f8d1c0a5c3dc514ed7f26bf7a"} Mar 18 16:51:42.609659 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.609631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e41abd12-239d-46fb-9cdb-9d35fa51024d-tls-certs\") pod \"model-serving-api-9699c8d45-ttb7x\" (UID: \"e41abd12-239d-46fb-9cdb-9d35fa51024d\") " pod="kserve/model-serving-api-9699c8d45-ttb7x" Mar 18 16:51:42.612061 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.612032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e41abd12-239d-46fb-9cdb-9d35fa51024d-tls-certs\") pod \"model-serving-api-9699c8d45-ttb7x\" (UID: \"e41abd12-239d-46fb-9cdb-9d35fa51024d\") " pod="kserve/model-serving-api-9699c8d45-ttb7x" Mar 18 16:51:42.791130 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.791102 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-cttxw\"" Mar 18 16:51:42.799064 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.799043 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-9699c8d45-ttb7x" Mar 18 16:51:42.934500 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:42.934470 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-9699c8d45-ttb7x"] Mar 18 16:51:42.936644 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:51:42.936600 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode41abd12_239d_46fb_9cdb_9d35fa51024d.slice/crio-17e4b7046ef62d825633e1bf87edc8be7b6dc263c8dcce13093ba44d498f693b WatchSource:0}: Error finding container 17e4b7046ef62d825633e1bf87edc8be7b6dc263c8dcce13093ba44d498f693b: Status 404 returned error can't find the container with id 17e4b7046ef62d825633e1bf87edc8be7b6dc263c8dcce13093ba44d498f693b Mar 18 16:51:43.252926 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:43.252864 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:51:43.253243 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:43.253181 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 16:51:43.254468 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:43.254409 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 16:51:43.598985 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:43.598948 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-9699c8d45-ttb7x" event={"ID":"e41abd12-239d-46fb-9cdb-9d35fa51024d","Type":"ContainerStarted","Data":"17e4b7046ef62d825633e1bf87edc8be7b6dc263c8dcce13093ba44d498f693b"}
Mar 18 16:51:43.600016 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:43.599989 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
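
Unlike the secret races above, this failure does not self-heal. Every pull of quay.io/opendatahub/odh-model-serving-api:fast is rejected at the manifest fetch with "unauthorized", so the kubelet alternates between ErrImagePull and ImagePullBackOff (the identical container-spec dump recurs at 16:51:58 and 16:52:26 below) and the server container never starts. An "unauthorized ... reading manifest" from quay.io typically means the repository or tag is private, or the pod lacks a suitable imagePullSecrets entry. The registry-side half can be checked off-cluster with a plain request against the standard OCI distribution endpoint; a sketch, where a 401 with a Www-Authenticate challenge would confirm an auth problem rather than a network one:

    package main

    import (
        "fmt"
        "net/http"
    )

    func main() {
        // Anonymous manifest request for the tag the kubelet is pulling,
        // using the OCI distribution API path /v2/<name>/manifests/<ref>.
        url := "https://quay.io/v2/opendatahub/odh-model-serving-api/manifests/fast"
        req, err := http.NewRequest(http.MethodGet, url, nil)
        if err != nil {
            panic(err)
        }
        req.Header.Set("Accept", "application/vnd.oci.image.manifest.v1+json")
        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        fmt.Println("status:", resp.Status)
        fmt.Println("challenge:", resp.Header.Get("Www-Authenticate"))
    }
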
\\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:51:45.607752 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:45.607659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-twgv8" event={"ID":"71bd2296-9d68-4ee7-96fc-7080d3255d69","Type":"ContainerStarted","Data":"7d1c343291497363420a6c41be70347647f05e87d43bc95b41df1a42d211857c"} Mar 18 16:51:45.607752 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:45.607727 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-twgv8" Mar 18 16:51:45.623815 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:45.623766 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-twgv8" podStartSLOduration=1.710076231 podStartE2EDuration="4.623748521s" podCreationTimestamp="2026-03-18 16:51:41 +0000 UTC" firstStartedPulling="2026-03-18 16:51:42.344971092 +0000 UTC m=+420.656223994" lastFinishedPulling="2026-03-18 16:51:45.25864337 +0000 UTC m=+423.569896284" observedRunningTime="2026-03-18 16:51:45.622633516 +0000 UTC m=+423.933886436" watchObservedRunningTime="2026-03-18 16:51:45.623748521 +0000 UTC m=+423.935001443" Mar 18 16:51:56.613645 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:51:56.613556 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-twgv8" Mar 18 16:51:58.552147 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:58.552046 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:51:58.552553 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:58.552212 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:51:58.553409 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:51:58.553382 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:52:13.286716 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:52:13.286682 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast 
in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:52:13.969975 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:13.969942 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr"] Mar 18 16:52:13.973078 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:13.973063 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" Mar 18 16:52:13.975411 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:13.975389 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qwszx\"" Mar 18 16:52:13.975540 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:13.975441 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Mar 18 16:52:13.975540 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:13.975441 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Mar 18 16:52:13.980447 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:13.980426 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr"] Mar 18 16:52:14.151146 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.151119 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/9a9d8fcc-a639-41e7-96d6-a089e8324b5d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-nbfjr\" (UID: \"9a9d8fcc-a639-41e7-96d6-a089e8324b5d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" Mar 18 16:52:14.151146 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.151149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9a9d8fcc-a639-41e7-96d6-a089e8324b5d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-nbfjr\" (UID: \"9a9d8fcc-a639-41e7-96d6-a089e8324b5d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" Mar 18 16:52:14.151332 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.151198 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhsbk\" (UniqueName: \"kubernetes.io/projected/9a9d8fcc-a639-41e7-96d6-a089e8324b5d-kube-api-access-qhsbk\") pod \"seaweedfs-tls-custom-5c88b85bb7-nbfjr\" (UID: \"9a9d8fcc-a639-41e7-96d6-a089e8324b5d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" Mar 18 16:52:14.252466 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.252409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhsbk\" (UniqueName: \"kubernetes.io/projected/9a9d8fcc-a639-41e7-96d6-a089e8324b5d-kube-api-access-qhsbk\") pod \"seaweedfs-tls-custom-5c88b85bb7-nbfjr\" (UID: \"9a9d8fcc-a639-41e7-96d6-a089e8324b5d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" Mar 18 16:52:14.252466 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.252457 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/9a9d8fcc-a639-41e7-96d6-a089e8324b5d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-nbfjr\" (UID: \"9a9d8fcc-a639-41e7-96d6-a089e8324b5d\") " 
pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" Mar 18 16:52:14.252579 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.252475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9a9d8fcc-a639-41e7-96d6-a089e8324b5d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-nbfjr\" (UID: \"9a9d8fcc-a639-41e7-96d6-a089e8324b5d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" Mar 18 16:52:14.252886 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.252865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9a9d8fcc-a639-41e7-96d6-a089e8324b5d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-nbfjr\" (UID: \"9a9d8fcc-a639-41e7-96d6-a089e8324b5d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" Mar 18 16:52:14.255029 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.255007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/9a9d8fcc-a639-41e7-96d6-a089e8324b5d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-nbfjr\" (UID: \"9a9d8fcc-a639-41e7-96d6-a089e8324b5d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" Mar 18 16:52:14.260042 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.260020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhsbk\" (UniqueName: \"kubernetes.io/projected/9a9d8fcc-a639-41e7-96d6-a089e8324b5d-kube-api-access-qhsbk\") pod \"seaweedfs-tls-custom-5c88b85bb7-nbfjr\" (UID: \"9a9d8fcc-a639-41e7-96d6-a089e8324b5d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" Mar 18 16:52:14.282517 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.282499 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr"
Mar 18 16:52:14.408148 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.408118 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr"]
Mar 18 16:52:14.410537 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:52:14.410503 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a9d8fcc_a639_41e7_96d6_a089e8324b5d.slice/crio-4fee7f811c1c85cfe5b4992e50426b42efc2acb23e6d95e6fcce9a2babd636a1 WatchSource:0}: Error finding container 4fee7f811c1c85cfe5b4992e50426b42efc2acb23e6d95e6fcce9a2babd636a1: Status 404 returned error can't find the container with id 4fee7f811c1c85cfe5b4992e50426b42efc2acb23e6d95e6fcce9a2babd636a1
Mar 18 16:52:14.701310 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:14.701267 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" event={"ID":"9a9d8fcc-a639-41e7-96d6-a089e8324b5d","Type":"ContainerStarted","Data":"4fee7f811c1c85cfe5b4992e50426b42efc2acb23e6d95e6fcce9a2babd636a1"}
Mar 18 16:52:17.711604 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:17.711568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" event={"ID":"9a9d8fcc-a639-41e7-96d6-a089e8324b5d","Type":"ContainerStarted","Data":"899a71d05d6e072a233c9cb55e27c0287191fb969ba40c2f0b8ac0a7b05384e7"}
Mar 18 16:52:17.728164 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:17.728108 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-nbfjr" podStartSLOduration=2.133935356 podStartE2EDuration="4.728090809s" podCreationTimestamp="2026-03-18 16:52:13 +0000 UTC" firstStartedPulling="2026-03-18 16:52:14.411950204 +0000 UTC m=+452.723203103" lastFinishedPulling="2026-03-18 16:52:17.006105643 +0000 UTC m=+455.317358556" observedRunningTime="2026-03-18 16:52:17.725751355 +0000 UTC m=+456.037004276" watchObservedRunningTime="2026-03-18 16:52:17.728090809 +0000 UTC m=+456.039343730"
Mar 18 16:52:26.579880 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:52:26.579794 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast"
Mar 18 16:52:26.580244 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:52:26.579960 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 16:52:26.581147 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:52:26.581119 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 16:52:26.814983 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.814950 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"]
Mar 18 16:52:26.818191 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.818175 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:26.820308 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.820283 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\""
Mar 18 16:52:26.820447 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.820283 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Mar 18 16:52:26.825681 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.825660 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"]
Mar 18 16:52:26.845182 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.845127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/af9bfd01-4022-420b-a391-212857be16a7-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-9f7vn\" (UID: \"af9bfd01-4022-420b-a391-212857be16a7\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:26.845182 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.845158 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/af9bfd01-4022-420b-a391-212857be16a7-data\") pod \"seaweedfs-tls-serving-7fd5766db9-9f7vn\" (UID: \"af9bfd01-4022-420b-a391-212857be16a7\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:26.845310 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.845180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfs77\" (UniqueName: \"kubernetes.io/projected/af9bfd01-4022-420b-a391-212857be16a7-kube-api-access-zfs77\") pod \"seaweedfs-tls-serving-7fd5766db9-9f7vn\" (UID: \"af9bfd01-4022-420b-a391-212857be16a7\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:26.946425 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.946398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfs77\" (UniqueName: \"kubernetes.io/projected/af9bfd01-4022-420b-a391-212857be16a7-kube-api-access-zfs77\") pod \"seaweedfs-tls-serving-7fd5766db9-9f7vn\" (UID: \"af9bfd01-4022-420b-a391-212857be16a7\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:26.946579 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.946506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/af9bfd01-4022-420b-a391-212857be16a7-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-9f7vn\" (UID: \"af9bfd01-4022-420b-a391-212857be16a7\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:26.946579 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.946541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/af9bfd01-4022-420b-a391-212857be16a7-data\") pod \"seaweedfs-tls-serving-7fd5766db9-9f7vn\" (UID: \"af9bfd01-4022-420b-a391-212857be16a7\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:26.946873 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.946852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/af9bfd01-4022-420b-a391-212857be16a7-data\") pod \"seaweedfs-tls-serving-7fd5766db9-9f7vn\" (UID: \"af9bfd01-4022-420b-a391-212857be16a7\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:26.948990 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.948973 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/af9bfd01-4022-420b-a391-212857be16a7-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-9f7vn\" (UID: \"af9bfd01-4022-420b-a391-212857be16a7\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:26.960076 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:26.960057 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfs77\" (UniqueName: \"kubernetes.io/projected/af9bfd01-4022-420b-a391-212857be16a7-kube-api-access-zfs77\") pod \"seaweedfs-tls-serving-7fd5766db9-9f7vn\" (UID: \"af9bfd01-4022-420b-a391-212857be16a7\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:27.128489 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:27.128436 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"
Mar 18 16:52:27.245914 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:27.245890 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn"]
Mar 18 16:52:27.248062 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:52:27.248025 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9bfd01_4022_420b_a391_212857be16a7.slice/crio-6cb0a756eb1440866c6d1d9179ddb19ee53a0b0d08e012dffa0adac2ea62c784 WatchSource:0}: Error finding container 6cb0a756eb1440866c6d1d9179ddb19ee53a0b0d08e012dffa0adac2ea62c784: Status 404 returned error can't find the container with id 6cb0a756eb1440866c6d1d9179ddb19ee53a0b0d08e012dffa0adac2ea62c784
Mar 18 16:52:27.743017 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:27.742981 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn" event={"ID":"af9bfd01-4022-420b-a391-212857be16a7","Type":"ContainerStarted","Data":"c97becb79f6c9056a4f0f0296ef96c3a022a1634048c57c0a697320d68d6f9a2"}
Mar 18 16:52:27.743017 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:27.743021 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn" event={"ID":"af9bfd01-4022-420b-a391-212857be16a7","Type":"ContainerStarted","Data":"6cb0a756eb1440866c6d1d9179ddb19ee53a0b0d08e012dffa0adac2ea62c784"}
Mar 18 16:52:27.760064 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:27.760018 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-9f7vn" podStartSLOduration=1.493128816 podStartE2EDuration="1.760003463s" podCreationTimestamp="2026-03-18 16:52:26 +0000 UTC" firstStartedPulling="2026-03-18 16:52:27.249249512 +0000 UTC m=+465.560502413" lastFinishedPulling="2026-03-18 16:52:27.516124154 +0000 UTC m=+465.827377060" observedRunningTime="2026-03-18 16:52:27.759262306 +0000 UTC m=+466.070515228" watchObservedRunningTime="2026-03-18 16:52:27.760003463 +0000 UTC m=+466.071256383"
Mar 18 16:52:41.287312 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:52:41.287278 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 16:52:42.944301 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:42.944270 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"]
Mar 18 16:52:42.949122 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:42.949105 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:52:42.950936 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:42.950908 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9lwwn\""
Mar 18 16:52:42.951085 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:42.951066 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Mar 18 16:52:42.951162 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:42.951068 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Mar 18 16:52:42.954571 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:42.954550 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"]
Mar 18 16:52:43.070254 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:43.070225 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr\" (UID: \"94a388fe-f8dd-4e67-8ae5-82157cf67a3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:52:43.070430 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:43.070259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvwc6\" (UniqueName: \"kubernetes.io/projected/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kube-api-access-kvwc6\") pod \"isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr\" (UID: \"94a388fe-f8dd-4e67-8ae5-82157cf67a3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:52:43.170771 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:43.170742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr\" (UID: \"94a388fe-f8dd-4e67-8ae5-82157cf67a3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:52:43.170899 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:43.170790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvwc6\" (UniqueName: \"kubernetes.io/projected/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kube-api-access-kvwc6\") pod \"isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr\" (UID: \"94a388fe-f8dd-4e67-8ae5-82157cf67a3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:52:43.171130 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:43.171110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr\" (UID: \"94a388fe-f8dd-4e67-8ae5-82157cf67a3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:52:43.178585 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:43.178562 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvwc6\" (UniqueName: \"kubernetes.io/projected/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kube-api-access-kvwc6\") pod \"isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr\" (UID: \"94a388fe-f8dd-4e67-8ae5-82157cf67a3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:52:43.259776 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:43.259705 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:52:43.385351 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:43.385321 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"]
Mar 18 16:52:43.387834 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:52:43.387806 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a388fe_f8dd_4e67_8ae5_82157cf67a3c.slice/crio-27c2592b925632a03e4e6dc4161d63f589b88c3f7184f58cf4fd29dbdc0314d1 WatchSource:0}: Error finding container 27c2592b925632a03e4e6dc4161d63f589b88c3f7184f58cf4fd29dbdc0314d1: Status 404 returned error can't find the container with id 27c2592b925632a03e4e6dc4161d63f589b88c3f7184f58cf4fd29dbdc0314d1
Mar 18 16:52:43.795343 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:43.795311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" event={"ID":"94a388fe-f8dd-4e67-8ae5-82157cf67a3c","Type":"ContainerStarted","Data":"27c2592b925632a03e4e6dc4161d63f589b88c3f7184f58cf4fd29dbdc0314d1"}
Mar 18 16:52:47.247977 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.247936 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fd89d9bf9-mh6nr"]
Mar 18 16:52:47.251222 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.251197 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.263530 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.263499 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fd89d9bf9-mh6nr"]
Mar 18 16:52:47.306968 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.306939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-trusted-ca-bundle\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.307128 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.306982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-console-config\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.307128 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.307013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrb22\" (UniqueName: \"kubernetes.io/projected/3464ba8e-beb1-44e3-b263-a29520f68252-kube-api-access-mrb22\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.307128 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.307087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-oauth-serving-cert\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.307291 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.307143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3464ba8e-beb1-44e3-b263-a29520f68252-console-oauth-config\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.307291 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.307174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3464ba8e-beb1-44e3-b263-a29520f68252-console-serving-cert\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.307291 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.307234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-service-ca\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.408255 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.408212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3464ba8e-beb1-44e3-b263-a29520f68252-console-serving-cert\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.408442 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.408291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-service-ca\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.408442 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.408348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-trusted-ca-bundle\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.408442 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.408403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-console-config\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.408442 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.408432 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrb22\" (UniqueName: \"kubernetes.io/projected/3464ba8e-beb1-44e3-b263-a29520f68252-kube-api-access-mrb22\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.408658 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.408463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-oauth-serving-cert\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.408658 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.408522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3464ba8e-beb1-44e3-b263-a29520f68252-console-oauth-config\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.409197 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.409163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-console-config\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.409378 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.409331 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-trusted-ca-bundle\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.409576 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.409556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-service-ca\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.409715 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.409698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3464ba8e-beb1-44e3-b263-a29520f68252-oauth-serving-cert\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.411494 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.411471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3464ba8e-beb1-44e3-b263-a29520f68252-console-oauth-config\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.411585 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.411570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3464ba8e-beb1-44e3-b263-a29520f68252-console-serving-cert\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.416283 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.416259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrb22\" (UniqueName: \"kubernetes.io/projected/3464ba8e-beb1-44e3-b263-a29520f68252-kube-api-access-mrb22\") pod \"console-5fd89d9bf9-mh6nr\" (UID: \"3464ba8e-beb1-44e3-b263-a29520f68252\") " pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.560571 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.560480 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:47.687961 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.687930 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fd89d9bf9-mh6nr"]
Mar 18 16:52:47.689782 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:52:47.689749 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3464ba8e_beb1_44e3_b263_a29520f68252.slice/crio-a0616bd0b3caf9e6a16483f5f71ada8af64eefcff988955db48d1211a490a860 WatchSource:0}: Error finding container a0616bd0b3caf9e6a16483f5f71ada8af64eefcff988955db48d1211a490a860: Status 404 returned error can't find the container with id a0616bd0b3caf9e6a16483f5f71ada8af64eefcff988955db48d1211a490a860
Mar 18 16:52:47.812084 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.811980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" event={"ID":"94a388fe-f8dd-4e67-8ae5-82157cf67a3c","Type":"ContainerStarted","Data":"d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b"}
Mar 18 16:52:47.813492 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.813464 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd89d9bf9-mh6nr" event={"ID":"3464ba8e-beb1-44e3-b263-a29520f68252","Type":"ContainerStarted","Data":"0ee05ebfae556c9b7cfe7b26f0b2b4de7a159a0aa5c61c68c1c685151a796b34"}
Mar 18 16:52:47.813625 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.813498 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd89d9bf9-mh6nr" event={"ID":"3464ba8e-beb1-44e3-b263-a29520f68252","Type":"ContainerStarted","Data":"a0616bd0b3caf9e6a16483f5f71ada8af64eefcff988955db48d1211a490a860"}
Mar 18 16:52:47.846295 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:47.846249 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fd89d9bf9-mh6nr" podStartSLOduration=0.846234506 podStartE2EDuration="846.234506ms" podCreationTimestamp="2026-03-18 16:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:52:47.844341084 +0000 UTC m=+486.155594008" watchObservedRunningTime="2026-03-18 16:52:47.846234506 +0000 UTC m=+486.157487427"
Mar 18 16:52:51.829144 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:51.829111 2575 generic.go:358] "Generic (PLEG): container finished" podID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerID="d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b" exitCode=0
Mar 18 16:52:51.829531 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:51.829184 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" event={"ID":"94a388fe-f8dd-4e67-8ae5-82157cf67a3c","Type":"ContainerDied","Data":"d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b"}
Mar 18 16:52:56.287438 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:52:56.287385 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 16:52:57.560665 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:57.560629 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:57.560665 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:57.560683 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:57.566081 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:57.566056 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:57.857750 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:57.857660 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fd89d9bf9-mh6nr"
Mar 18 16:52:57.911923 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:57.911891 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65ff667744-cfz6r"]
Mar 18 16:52:58.859913 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:52:58.859871 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" event={"ID":"94a388fe-f8dd-4e67-8ae5-82157cf67a3c","Type":"ContainerStarted","Data":"b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a"}
Mar 18 16:53:00.869629 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:00.869579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" event={"ID":"94a388fe-f8dd-4e67-8ae5-82157cf67a3c","Type":"ContainerStarted","Data":"825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed"}
Mar 18 16:53:00.870069 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:00.869853 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:53:00.871527 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:00.871483 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Mar 18 16:53:00.891477 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:00.891423 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podStartSLOduration=2.110060172 podStartE2EDuration="18.891409164s" podCreationTimestamp="2026-03-18 16:52:42 +0000 UTC" firstStartedPulling="2026-03-18 16:52:43.389767379 +0000 UTC m=+481.701020278" lastFinishedPulling="2026-03-18 16:53:00.17111635 +0000 UTC m=+498.482369270" observedRunningTime="2026-03-18 16:53:00.889350307 +0000 UTC m=+499.200603229" watchObservedRunningTime="2026-03-18 16:53:00.891409164 +0000 UTC m=+499.202662079"
Mar 18 16:53:01.873697 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:01.873653 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:53:01.874137 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:01.873770 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Mar 18 16:53:01.874826 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:01.874799 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:53:02.877141 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:02.877098 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Mar 18 16:53:02.877618 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:02.877518 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:53:08.571914 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:53:08.571809 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast"
Mar 18 16:53:08.572408 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:53:08.572029 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 16:53:08.573231 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:53:08.573201 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 16:53:12.877653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:12.877606 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Mar 18 16:53:12.878183 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:12.878157 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:53:21.287389 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:53:21.287335 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 16:53:22.877331 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:22.877285 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Mar 18 16:53:22.877803 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:22.877745 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:53:22.935913 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:22.935876 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65ff667744-cfz6r" podUID="0a41a65a-dd92-4261-8e3b-b795190bc745" containerName="console" containerID="cri-o://d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7" gracePeriod=15
Mar 18 16:53:23.204226 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.204198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65ff667744-cfz6r_0a41a65a-dd92-4261-8e3b-b795190bc745/console/0.log"
Mar 18 16:53:23.204350 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.204268 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:53:23.300004 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.299973 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-console-config\") pod \"0a41a65a-dd92-4261-8e3b-b795190bc745\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") "
Mar 18 16:53:23.300167 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.300021 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-service-ca\") pod \"0a41a65a-dd92-4261-8e3b-b795190bc745\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") "
Mar 18 16:53:23.300167 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.300051 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-trusted-ca-bundle\") pod \"0a41a65a-dd92-4261-8e3b-b795190bc745\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") "
Mar 18 16:53:23.300167 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.300067 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-oauth-config\") pod \"0a41a65a-dd92-4261-8e3b-b795190bc745\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") "
Mar 18 16:53:23.300167 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.300087 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-oauth-serving-cert\") pod \"0a41a65a-dd92-4261-8e3b-b795190bc745\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") "
Mar 18 16:53:23.300167 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.300105 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkh49\" (UniqueName: \"kubernetes.io/projected/0a41a65a-dd92-4261-8e3b-b795190bc745-kube-api-access-pkh49\") pod \"0a41a65a-dd92-4261-8e3b-b795190bc745\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") "
Mar 18 16:53:23.300167 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.300168 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-serving-cert\") pod \"0a41a65a-dd92-4261-8e3b-b795190bc745\" (UID: \"0a41a65a-dd92-4261-8e3b-b795190bc745\") "
Mar 18 16:53:23.300499 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.300467 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-console-config" (OuterVolumeSpecName: "console-config") pod "0a41a65a-dd92-4261-8e3b-b795190bc745" (UID: "0a41a65a-dd92-4261-8e3b-b795190bc745"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:53:23.300557 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.300495 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0a41a65a-dd92-4261-8e3b-b795190bc745" (UID: "0a41a65a-dd92-4261-8e3b-b795190bc745"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:53:23.300557 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.300519 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-service-ca" (OuterVolumeSpecName: "service-ca") pod "0a41a65a-dd92-4261-8e3b-b795190bc745" (UID: "0a41a65a-dd92-4261-8e3b-b795190bc745"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:53:23.300557 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.300543 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0a41a65a-dd92-4261-8e3b-b795190bc745" (UID: "0a41a65a-dd92-4261-8e3b-b795190bc745"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:53:23.302534 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.302506 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0a41a65a-dd92-4261-8e3b-b795190bc745" (UID: "0a41a65a-dd92-4261-8e3b-b795190bc745"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:53:23.302638 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.302555 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a41a65a-dd92-4261-8e3b-b795190bc745-kube-api-access-pkh49" (OuterVolumeSpecName: "kube-api-access-pkh49") pod "0a41a65a-dd92-4261-8e3b-b795190bc745" (UID: "0a41a65a-dd92-4261-8e3b-b795190bc745"). InnerVolumeSpecName "kube-api-access-pkh49". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:53:23.302638 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.302575 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0a41a65a-dd92-4261-8e3b-b795190bc745" (UID: "0a41a65a-dd92-4261-8e3b-b795190bc745"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:53:23.401416 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.401389 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-console-config\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:53:23.401416 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.401410 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-service-ca\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:53:23.401569 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.401488 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-trusted-ca-bundle\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:53:23.401569 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.401497 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-oauth-config\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:53:23.401569 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.401506 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a41a65a-dd92-4261-8e3b-b795190bc745-oauth-serving-cert\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:53:23.401569 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.401515 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pkh49\" (UniqueName: \"kubernetes.io/projected/0a41a65a-dd92-4261-8e3b-b795190bc745-kube-api-access-pkh49\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:53:23.401569 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.401523 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a41a65a-dd92-4261-8e3b-b795190bc745-console-serving-cert\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 16:53:23.946574 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.946541 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65ff667744-cfz6r_0a41a65a-dd92-4261-8e3b-b795190bc745/console/0.log"
Mar 18 16:53:23.946951 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.946587 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a41a65a-dd92-4261-8e3b-b795190bc745" containerID="d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7" exitCode=2
Mar 18 16:53:23.946951 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.946678 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65ff667744-cfz6r"
Mar 18 16:53:23.946951 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.946677 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65ff667744-cfz6r" event={"ID":"0a41a65a-dd92-4261-8e3b-b795190bc745","Type":"ContainerDied","Data":"d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7"}
Mar 18 16:53:23.946951 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.946720 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65ff667744-cfz6r" event={"ID":"0a41a65a-dd92-4261-8e3b-b795190bc745","Type":"ContainerDied","Data":"a4526effcb02b384e5aca7c826817250cc807989131b429f7a3b23da1e75164d"}
Mar 18 16:53:23.946951 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.946740 2575 scope.go:117] "RemoveContainer" containerID="d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7"
Mar 18 16:53:23.955594 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.955578 2575 scope.go:117] "RemoveContainer" containerID="d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7"
Mar 18 16:53:23.955820 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:53:23.955804 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7\": container with ID starting with d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7 not found: ID does not exist" containerID="d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7"
Mar 18 16:53:23.955866 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.955827 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7"} err="failed to get container status \"d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7\": rpc error: code = NotFound desc = could not find container \"d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7\": container with ID starting with d002d22eba6e0c3194ef241d861390378a7e05fb7720c6c0f90c71d23267eff7 not found: ID does not exist"
Mar 18 16:53:23.966454 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.966432 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65ff667744-cfz6r"]
Mar 18 16:53:23.970499 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:23.970482 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65ff667744-cfz6r"]
Mar 18 16:53:24.290774 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:24.290692 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a41a65a-dd92-4261-8e3b-b795190bc745" path="/var/lib/kubelet/pods/0a41a65a-dd92-4261-8e3b-b795190bc745/volumes"
Mar 18 16:53:32.877909 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:32.877852 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Mar 18 16:53:32.878498 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:32.878278 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:53:36.286836 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:53:36.286806 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 16:53:42.877475 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:42.877422 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Mar 18 16:53:42.877899 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:42.877872 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:53:50.287158 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:53:50.287125 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 16:53:52.877566 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:52.877524 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Mar 18 16:53:52.878012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:53:52.877971 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:54:02.877519 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:02.877478 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Mar 18 16:54:02.877964 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:02.877917 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:54:05.287354 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:54:05.287296 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 16:54:12.288308 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:12.288264 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Mar 18 16:54:12.288889 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:12.288864 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:54:18.287247 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:54:18.287217 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 16:54:22.290288 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:22.290262 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:54:22.290686 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:22.290313 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"
Mar 18 16:54:27.982235 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:27.982202 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"]
Mar 18 16:54:27.982610 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:27.982473 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" containerID="cri-o://b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a" gracePeriod=30
Mar 18 16:54:27.982665 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:27.982584 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" containerID="cri-o://825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed" gracePeriod=30
Mar 18 16:54:28.248912 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.248836 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"]
Mar 18 16:54:28.249211 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.249198 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a41a65a-dd92-4261-8e3b-b795190bc745" containerName="console"
Mar 18 16:54:28.249258 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.249213 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a41a65a-dd92-4261-8e3b-b795190bc745" containerName="console"
Mar 18 16:54:28.249292 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.249279 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a41a65a-dd92-4261-8e3b-b795190bc745" containerName="console"
Mar 18 16:54:28.252156 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.252141 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"
Mar 18 16:54:28.261095 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.261073 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"]
Mar 18 16:54:28.408917 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.408885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm\" (UID: \"b897baac-bbf7-4ed9-8ac8-7a44308baaae\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"
Mar 18 16:54:28.409086 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.408944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkg2x\" (UniqueName: \"kubernetes.io/projected/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kube-api-access-qkg2x\") pod \"isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm\" (UID: \"b897baac-bbf7-4ed9-8ac8-7a44308baaae\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"
Mar 18 16:54:28.509966 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.509872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm\" (UID: \"b897baac-bbf7-4ed9-8ac8-7a44308baaae\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"
Mar 18 16:54:28.509966 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.509932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkg2x\" (UniqueName: \"kubernetes.io/projected/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kube-api-access-qkg2x\") pod \"isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm\" (UID: \"b897baac-bbf7-4ed9-8ac8-7a44308baaae\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"
Mar 18 16:54:28.510319 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.510296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm\" (UID: \"b897baac-bbf7-4ed9-8ac8-7a44308baaae\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"
Mar 18 16:54:28.519412 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.519383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkg2x\" (UniqueName: \"kubernetes.io/projected/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kube-api-access-qkg2x\") pod \"isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm\" (UID: \"b897baac-bbf7-4ed9-8ac8-7a44308baaae\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"
Mar 18 16:54:28.563121 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.563077 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"
Mar 18 16:54:28.689269 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:28.689241 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"]
Mar 18 16:54:28.691991 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:54:28.691956 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb897baac_bbf7_4ed9_8ac8_7a44308baaae.slice/crio-883a68c296d96586413c5973ba4a7b81499ea059d21230e99726ad40b68acbc0 WatchSource:0}: Error finding container 883a68c296d96586413c5973ba4a7b81499ea059d21230e99726ad40b68acbc0: Status 404 returned error can't find the container with id 883a68c296d96586413c5973ba4a7b81499ea059d21230e99726ad40b68acbc0
Mar 18 16:54:29.164410 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:29.164350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" event={"ID":"b897baac-bbf7-4ed9-8ac8-7a44308baaae","Type":"ContainerStarted","Data":"a8991dd3537bc2f51e7b96b089b08fd0598a2a6101be9f74d57861ee84d67b1c"}
Mar 18 16:54:29.164785 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:29.164416 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" event={"ID":"b897baac-bbf7-4ed9-8ac8-7a44308baaae","Type":"ContainerStarted","Data":"883a68c296d96586413c5973ba4a7b81499ea059d21230e99726ad40b68acbc0"}
Mar 18 16:54:29.558771 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:54:29.558734 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the
requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:54:29.558939 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:54:29.558889 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:54:29.560051 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:54:29.560029 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading 
manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:54:32.288350 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:32.288287 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Mar 18 16:54:32.288827 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:32.288670 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:54:33.183123 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:33.183084 2575 generic.go:358] "Generic (PLEG): container finished" podID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerID="b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a" exitCode=0 Mar 18 16:54:33.183302 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:33.183162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" event={"ID":"94a388fe-f8dd-4e67-8ae5-82157cf67a3c","Type":"ContainerDied","Data":"b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a"} Mar 18 16:54:33.184327 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:33.184306 2575 generic.go:358] "Generic (PLEG): container finished" podID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerID="a8991dd3537bc2f51e7b96b089b08fd0598a2a6101be9f74d57861ee84d67b1c" exitCode=0 Mar 18 16:54:33.184464 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:33.184382 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" event={"ID":"b897baac-bbf7-4ed9-8ac8-7a44308baaae","Type":"ContainerDied","Data":"a8991dd3537bc2f51e7b96b089b08fd0598a2a6101be9f74d57861ee84d67b1c"} Mar 18 16:54:34.190179 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:34.190147 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" event={"ID":"b897baac-bbf7-4ed9-8ac8-7a44308baaae","Type":"ContainerStarted","Data":"361a511ea48ca8e12f9cd424e39b112d9147800fd07d73eda874de4dc38dc682"} Mar 18 16:54:34.190179 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:34.190182 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" event={"ID":"b897baac-bbf7-4ed9-8ac8-7a44308baaae","Type":"ContainerStarted","Data":"17e6b0ee2b50242e0f3c98cf531d893a6c227df0b1dc6bc536a32921db591159"} Mar 18 16:54:34.190608 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:34.190582 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" Mar 18 16:54:34.191858 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:34.191831 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:54:34.210169 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:34.210127 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podStartSLOduration=6.210110879 podStartE2EDuration="6.210110879s" podCreationTimestamp="2026-03-18 16:54:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:54:34.209710531 +0000 UTC m=+592.520963473" watchObservedRunningTime="2026-03-18 16:54:34.210110879 +0000 UTC m=+592.521363799" Mar 18 16:54:35.193308 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:35.193276 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" Mar 18 16:54:35.193743 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:35.193437 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:54:35.194423 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:35.194395 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:54:36.196640 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:36.196595 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:54:36.197158 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:36.197075 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:54:42.197239 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:42.197213 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 16:54:42.199095 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:42.199074 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 16:54:42.288698 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:42.288301 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Mar 18 16:54:42.288805 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:54:42.288767 2575 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:54:42.289051 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:42.289023 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:54:46.196646 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:46.196611 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:54:46.197119 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:46.197095 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:54:52.288398 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:52.288285 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Mar 18 16:54:52.290824 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:52.288717 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:54:52.290824 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:52.290589 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" Mar 18 16:54:52.290824 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:52.290630 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" Mar 18 16:54:53.287199 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:54:53.287162 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: 
access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:54:56.196868 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:56.196820 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:54:56.197400 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:56.197231 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:54:58.130278 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.130249 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" Mar 18 16:54:58.262134 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.262051 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kserve-provision-location\") pod \"94a388fe-f8dd-4e67-8ae5-82157cf67a3c\" (UID: \"94a388fe-f8dd-4e67-8ae5-82157cf67a3c\") " Mar 18 16:54:58.262134 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.262095 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvwc6\" (UniqueName: \"kubernetes.io/projected/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kube-api-access-kvwc6\") pod \"94a388fe-f8dd-4e67-8ae5-82157cf67a3c\" (UID: \"94a388fe-f8dd-4e67-8ae5-82157cf67a3c\") " Mar 18 16:54:58.262463 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.262440 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "94a388fe-f8dd-4e67-8ae5-82157cf67a3c" (UID: "94a388fe-f8dd-4e67-8ae5-82157cf67a3c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:54:58.264390 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.264345 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kube-api-access-kvwc6" (OuterVolumeSpecName: "kube-api-access-kvwc6") pod "94a388fe-f8dd-4e67-8ae5-82157cf67a3c" (UID: "94a388fe-f8dd-4e67-8ae5-82157cf67a3c"). InnerVolumeSpecName "kube-api-access-kvwc6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:54:58.269921 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.269897 2575 generic.go:358] "Generic (PLEG): container finished" podID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerID="825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed" exitCode=0 Mar 18 16:54:58.270034 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.269959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" event={"ID":"94a388fe-f8dd-4e67-8ae5-82157cf67a3c","Type":"ContainerDied","Data":"825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed"} Mar 18 16:54:58.270034 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.269989 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" Mar 18 16:54:58.270034 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.270009 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr" event={"ID":"94a388fe-f8dd-4e67-8ae5-82157cf67a3c","Type":"ContainerDied","Data":"27c2592b925632a03e4e6dc4161d63f589b88c3f7184f58cf4fd29dbdc0314d1"} Mar 18 16:54:58.270034 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.270033 2575 scope.go:117] "RemoveContainer" containerID="825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed" Mar 18 16:54:58.279223 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.279199 2575 scope.go:117] "RemoveContainer" containerID="b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a" Mar 18 16:54:58.286850 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.286836 2575 scope.go:117] "RemoveContainer" containerID="d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b" Mar 18 16:54:58.293852 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.293825 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"] Mar 18 16:54:58.297391 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.297353 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-bfbcdb44b-w4hqr"] Mar 18 16:54:58.303175 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.303154 2575 scope.go:117] "RemoveContainer" containerID="825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed" Mar 18 16:54:58.303469 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:54:58.303448 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed\": container with ID starting with 825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed not found: ID does not exist" containerID="825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed" Mar 18 16:54:58.303536 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.303479 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed"} err="failed to get container status \"825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed\": rpc error: code = NotFound desc = could not find container \"825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed\": container with ID starting with 825d46c50bc659ab5543e4c69faa688395fd0a6f95da3c4221cb31ac67d611ed not 
found: ID does not exist" Mar 18 16:54:58.303536 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.303496 2575 scope.go:117] "RemoveContainer" containerID="b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a" Mar 18 16:54:58.303767 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:54:58.303748 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a\": container with ID starting with b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a not found: ID does not exist" containerID="b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a" Mar 18 16:54:58.303821 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.303771 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a"} err="failed to get container status \"b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a\": rpc error: code = NotFound desc = could not find container \"b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a\": container with ID starting with b62e3a540fd4e2ee20a1fde67071254dcdd3c274643dc097c6cd8d4e5e75a54a not found: ID does not exist" Mar 18 16:54:58.303821 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.303794 2575 scope.go:117] "RemoveContainer" containerID="d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b" Mar 18 16:54:58.304037 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:54:58.304019 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b\": container with ID starting with d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b not found: ID does not exist" containerID="d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b" Mar 18 16:54:58.304083 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.304042 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b"} err="failed to get container status \"d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b\": rpc error: code = NotFound desc = could not find container \"d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b\": container with ID starting with d1c246bfa5d69c74d3c8faaa46c2b721b46ed7ca249023bb20ab877eb4df1b9b not found: ID does not exist" Mar 18 16:54:58.362938 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.362915 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:54:58.362938 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:54:58.362937 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kvwc6\" (UniqueName: \"kubernetes.io/projected/94a388fe-f8dd-4e67-8ae5-82157cf67a3c-kube-api-access-kvwc6\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:55:00.290217 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:00.290182 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" path="/var/lib/kubelet/pods/94a388fe-f8dd-4e67-8ae5-82157cf67a3c/volumes" Mar 18 16:55:06.196873 
ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:06.196826 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:55:06.197586 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:06.197551 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:06.287632 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:06.287507 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:55:06.287701 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:55:06.287675 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:55:16.197299 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:16.197233 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:55:16.197864 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:16.197785 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:21.286839 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:55:21.286806 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:55:26.197031 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:26.196981 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" 
podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:55:26.197540 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:26.197501 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:36.197172 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:36.197122 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:55:36.197678 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:36.197652 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:36.291168 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:55:36.291142 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:55:46.197376 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:46.197306 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:55:46.197823 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:46.197800 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:49.286708 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:55:49.286650 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to 
the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:55:56.197169 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:56.197132 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" Mar 18 16:55:56.197599 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:55:56.197220 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" Mar 18 16:56:02.289629 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:56:02.289591 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:56:03.488009 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:03.487974 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"] Mar 18 16:56:03.488513 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:03.488329 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" containerID="cri-o://17e6b0ee2b50242e0f3c98cf531d893a6c227df0b1dc6bc536a32921db591159" gracePeriod=30 Mar 18 16:56:03.488583 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:03.488427 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" containerID="cri-o://361a511ea48ca8e12f9cd424e39b112d9147800fd07d73eda874de4dc38dc682" gracePeriod=30 Mar 18 16:56:06.197640 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:06.197575 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:56:06.198043 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:06.197844 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:08.506012 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:08.505978 2575 generic.go:358] "Generic (PLEG): container finished" podID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerID="17e6b0ee2b50242e0f3c98cf531d893a6c227df0b1dc6bc536a32921db591159" exitCode=0 Mar 18 16:56:08.506350 ip-10-0-139-49 
kubenswrapper[2575]: I0318 16:56:08.506046 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" event={"ID":"b897baac-bbf7-4ed9-8ac8-7a44308baaae","Type":"ContainerDied","Data":"17e6b0ee2b50242e0f3c98cf531d893a6c227df0b1dc6bc536a32921db591159"} Mar 18 16:56:13.261610 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.261564 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl"] Mar 18 16:56:13.261969 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.261917 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="storage-initializer" Mar 18 16:56:13.261969 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.261929 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="storage-initializer" Mar 18 16:56:13.261969 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.261942 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" Mar 18 16:56:13.261969 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.261951 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" Mar 18 16:56:13.261969 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.261962 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" Mar 18 16:56:13.261969 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.261967 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" Mar 18 16:56:13.262179 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.262033 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="kserve-container" Mar 18 16:56:13.262179 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.262044 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="94a388fe-f8dd-4e67-8ae5-82157cf67a3c" containerName="agent" Mar 18 16:56:13.265330 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.265314 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:56:13.275535 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.275507 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl"] Mar 18 16:56:13.356660 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.356634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c351db7-dc44-43ab-ac42-12a48913a689-kserve-provision-location\") pod \"isvc-logger-predictor-678fd49897-xd9nl\" (UID: \"0c351db7-dc44-43ab-ac42-12a48913a689\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:56:13.356810 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.356746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8n87\" (UniqueName: \"kubernetes.io/projected/0c351db7-dc44-43ab-ac42-12a48913a689-kube-api-access-s8n87\") pod \"isvc-logger-predictor-678fd49897-xd9nl\" (UID: \"0c351db7-dc44-43ab-ac42-12a48913a689\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:56:13.458064 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.458033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8n87\" (UniqueName: \"kubernetes.io/projected/0c351db7-dc44-43ab-ac42-12a48913a689-kube-api-access-s8n87\") pod \"isvc-logger-predictor-678fd49897-xd9nl\" (UID: \"0c351db7-dc44-43ab-ac42-12a48913a689\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:56:13.458227 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.458076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c351db7-dc44-43ab-ac42-12a48913a689-kserve-provision-location\") pod \"isvc-logger-predictor-678fd49897-xd9nl\" (UID: \"0c351db7-dc44-43ab-ac42-12a48913a689\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:56:13.458432 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.458417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c351db7-dc44-43ab-ac42-12a48913a689-kserve-provision-location\") pod \"isvc-logger-predictor-678fd49897-xd9nl\" (UID: \"0c351db7-dc44-43ab-ac42-12a48913a689\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:56:13.466060 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.466040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8n87\" (UniqueName: \"kubernetes.io/projected/0c351db7-dc44-43ab-ac42-12a48913a689-kube-api-access-s8n87\") pod \"isvc-logger-predictor-678fd49897-xd9nl\" (UID: \"0c351db7-dc44-43ab-ac42-12a48913a689\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:56:13.582421 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.582326 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:56:13.705172 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:13.705144 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl"] Mar 18 16:56:13.707416 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:56:13.707385 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c351db7_dc44_43ab_ac42_12a48913a689.slice/crio-4f2e483c9486e28c6611e9ecdfa23bc6c385e073e7162bc3c3a111144ee16ea5 WatchSource:0}: Error finding container 4f2e483c9486e28c6611e9ecdfa23bc6c385e073e7162bc3c3a111144ee16ea5: Status 404 returned error can't find the container with id 4f2e483c9486e28c6611e9ecdfa23bc6c385e073e7162bc3c3a111144ee16ea5 Mar 18 16:56:14.526809 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:14.526772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" event={"ID":"0c351db7-dc44-43ab-ac42-12a48913a689","Type":"ContainerStarted","Data":"d74e59295a2cf90df236d005ef947fb79fc4ed20754ee6ddbe1670fd0c738459"} Mar 18 16:56:14.526809 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:14.526813 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" event={"ID":"0c351db7-dc44-43ab-ac42-12a48913a689","Type":"ContainerStarted","Data":"4f2e483c9486e28c6611e9ecdfa23bc6c385e073e7162bc3c3a111144ee16ea5"} Mar 18 16:56:16.197552 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:16.197511 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:56:16.197958 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:16.197797 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:17.286834 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:56:17.286805 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:56:18.541635 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:18.541599 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c351db7-dc44-43ab-ac42-12a48913a689" containerID="d74e59295a2cf90df236d005ef947fb79fc4ed20754ee6ddbe1670fd0c738459" exitCode=0 Mar 18 16:56:18.542040 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:18.541652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" event={"ID":"0c351db7-dc44-43ab-ac42-12a48913a689","Type":"ContainerDied","Data":"d74e59295a2cf90df236d005ef947fb79fc4ed20754ee6ddbe1670fd0c738459"} Mar 18 16:56:19.548352 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:19.548320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" event={"ID":"0c351db7-dc44-43ab-ac42-12a48913a689","Type":"ContainerStarted","Data":"d27af57178ea649fbec2d63fe94d9d810c0934c05be4bed58584243a73ff5c5b"} Mar 18 16:56:19.548743 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:19.548376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" event={"ID":"0c351db7-dc44-43ab-ac42-12a48913a689","Type":"ContainerStarted","Data":"95add0971349057340cb95b26940877f87fc5fdc0ef74a8c8ac37c3d43d61b69"} Mar 18 16:56:19.548743 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:19.548674 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:56:19.550113 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:19.550086 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:56:19.566419 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:19.566325 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podStartSLOduration=6.566312691 podStartE2EDuration="6.566312691s" podCreationTimestamp="2026-03-18 16:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:56:19.564578061 +0000 UTC m=+697.875830998" watchObservedRunningTime="2026-03-18 16:56:19.566312691 +0000 UTC m=+697.877565612" Mar 18 16:56:20.551760 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:20.551728 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:56:20.552237 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:20.551821 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:56:20.552756 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:20.552729 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:21.555223 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:21.555184 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:56:21.555676 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:21.555451 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:26.196655 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:26.196603 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:5000: connect: connection refused" Mar 18 16:56:26.197069 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:26.196793 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" Mar 18 16:56:26.197069 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:26.196971 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:26.197162 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:26.197070 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" Mar 18 16:56:30.287831 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:56:30.287790 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:56:31.555440 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:31.555398 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:56:31.555848 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:31.555817 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:33.599893 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:33.599861 2575 generic.go:358] "Generic (PLEG): container finished" podID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerID="361a511ea48ca8e12f9cd424e39b112d9147800fd07d73eda874de4dc38dc682" exitCode=0 Mar 18 16:56:33.600226 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:33.599942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" 
event={"ID":"b897baac-bbf7-4ed9-8ac8-7a44308baaae","Type":"ContainerDied","Data":"361a511ea48ca8e12f9cd424e39b112d9147800fd07d73eda874de4dc38dc682"} Mar 18 16:56:33.640226 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:33.640206 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" Mar 18 16:56:33.724498 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:33.724470 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kserve-provision-location\") pod \"b897baac-bbf7-4ed9-8ac8-7a44308baaae\" (UID: \"b897baac-bbf7-4ed9-8ac8-7a44308baaae\") " Mar 18 16:56:33.724673 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:33.724537 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkg2x\" (UniqueName: \"kubernetes.io/projected/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kube-api-access-qkg2x\") pod \"b897baac-bbf7-4ed9-8ac8-7a44308baaae\" (UID: \"b897baac-bbf7-4ed9-8ac8-7a44308baaae\") " Mar 18 16:56:33.724802 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:33.724776 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b897baac-bbf7-4ed9-8ac8-7a44308baaae" (UID: "b897baac-bbf7-4ed9-8ac8-7a44308baaae"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:56:33.726758 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:33.726736 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kube-api-access-qkg2x" (OuterVolumeSpecName: "kube-api-access-qkg2x") pod "b897baac-bbf7-4ed9-8ac8-7a44308baaae" (UID: "b897baac-bbf7-4ed9-8ac8-7a44308baaae"). InnerVolumeSpecName "kube-api-access-qkg2x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:56:33.825047 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:33.824982 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qkg2x\" (UniqueName: \"kubernetes.io/projected/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kube-api-access-qkg2x\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:56:33.825047 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:33.825004 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b897baac-bbf7-4ed9-8ac8-7a44308baaae-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:56:34.604509 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:34.604416 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" event={"ID":"b897baac-bbf7-4ed9-8ac8-7a44308baaae","Type":"ContainerDied","Data":"883a68c296d96586413c5973ba4a7b81499ea059d21230e99726ad40b68acbc0"} Mar 18 16:56:34.604509 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:34.604467 2575 scope.go:117] "RemoveContainer" containerID="361a511ea48ca8e12f9cd424e39b112d9147800fd07d73eda874de4dc38dc682" Mar 18 16:56:34.604509 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:34.604485 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm" Mar 18 16:56:34.612870 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:34.612854 2575 scope.go:117] "RemoveContainer" containerID="17e6b0ee2b50242e0f3c98cf531d893a6c227df0b1dc6bc536a32921db591159" Mar 18 16:56:34.619746 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:34.619722 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"] Mar 18 16:56:34.621014 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:34.620991 2575 scope.go:117] "RemoveContainer" containerID="a8991dd3537bc2f51e7b96b089b08fd0598a2a6101be9f74d57861ee84d67b1c" Mar 18 16:56:34.624589 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:34.624568 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6466bdb79f-rl9fm"] Mar 18 16:56:36.291134 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:36.291099 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" path="/var/lib/kubelet/pods/b897baac-bbf7-4ed9-8ac8-7a44308baaae/volumes" Mar 18 16:56:41.555448 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:41.555401 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:56:41.555937 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:41.555910 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:43.286948 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:56:43.286917 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:56:51.555342 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:51.555285 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:56:51.555840 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:56:51.555761 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:56:57.287180 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:56:57.287147 2575 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:57:01.555212 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:01.555162 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:57:01.555669 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:01.555604 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:08.287822 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:57:08.287780 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:57:11.555574 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:11.555520 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:57:11.556107 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:11.556081 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:21.555330 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:21.555280 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:57:21.555762 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:21.555733 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:23.589185 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:57:23.589086 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:57:23.589602 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:57:23.589275 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access 
to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:57:23.590467 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:57:23.590436 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:57:31.555277 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:31.555228 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:57:31.555716 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:31.555654 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:34.285908 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:34.285870 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:57:34.286388 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:34.286136 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:37.287193 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:57:37.287162 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:57:44.290633 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:44.290607 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:57:44.291003 ip-10-0-139-49 kubenswrapper[2575]: I0318 
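
[annotation] The three records at 16:57:23 are one failed pull attempt seen at three layers: the CRI PullImage RPC fails, kuberuntime_manager dumps the full Container spec as an unhandled start error, and pod_workers marks the sync as failed with ErrImagePull. The underlying 401 can be checked off-cluster with an anonymous Registry v2 manifest request; a hedged sketch follows, with the caveat that quay.io normally answers 401 to any unauthenticated manifest GET until a bearer token is presented, so a 401 alone does not prove the tag is private:

    // Off-cluster check of the failure the kubelet reports: an anonymous
    // Docker Registry v2 manifest request for the tag (no token flow).
    package main

    import (
    	"fmt"
    	"net/http"
    )

    func main() {
    	url := "https://quay.io/v2/opendatahub/odh-model-serving-api/manifests/fast"
    	req, err := http.NewRequest("GET", url, nil)
    	if err != nil {
    		panic(err)
    	}
    	req.Header.Set("Accept", "application/vnd.oci.image.manifest.v1+json")
    	resp, err := http.DefaultClient.Do(req)
    	if err != nil {
    		fmt.Println("request failed:", err)
    		return
    	}
    	defer resp.Body.Close()
    	fmt.Println("status:", resp.Status) // expect 401 without credentials
    }
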
16:57:44.290657 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:57:50.287700 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:57:50.287669 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:57:58.678526 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.678493 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl"] Mar 18 16:57:58.679008 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.678828 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" containerID="cri-o://d27af57178ea649fbec2d63fe94d9d810c0934c05be4bed58584243a73ff5c5b" gracePeriod=30 Mar 18 16:57:58.679008 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.678843 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" containerID="cri-o://95add0971349057340cb95b26940877f87fc5fdc0ef74a8c8ac37c3d43d61b69" gracePeriod=30 Mar 18 16:57:58.984820 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.984733 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw"] Mar 18 16:57:58.985148 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.985131 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="storage-initializer" Mar 18 16:57:58.985235 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.985152 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="storage-initializer" Mar 18 16:57:58.985235 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.985166 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" Mar 18 16:57:58.985235 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.985174 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" Mar 18 16:57:58.985235 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.985202 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" Mar 18 16:57:58.985235 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.985211 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" Mar 18 16:57:58.985504 
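
[annotation] The logger predictor's deletion shows the 30-second grace contract: at 16:57:58 both containers get "Killing container with a grace period ... gracePeriod=30"; the kserve-container exits about five seconds later (16:58:03 below), while the agent only finishes at 16:58:28, essentially the full window later. A sketch of the TERM-then-KILL escalation the runtime applies at that deadline, using a stand-in local process rather than CRI:

    // TERM, wait up to the grace period, then KILL. Stand-in process; the
    // real escalation happens inside the container runtime, not here.
    package main

    import (
    	"os/exec"
    	"syscall"
    	"time"
    )

    func stopWithGrace(cmd *exec.Cmd, grace time.Duration) {
    	cmd.Process.Signal(syscall.SIGTERM) // polite request to exit
    	done := make(chan struct{})
    	go func() { cmd.Wait(); close(done) }()
    	select {
    	case <-done: // exited within the grace period
    	case <-time.After(grace):
    		cmd.Process.Kill() // SIGKILL once the deadline passes
    		<-done
    	}
    }

    func main() {
    	cmd := exec.Command("sleep", "300")
    	if err := cmd.Start(); err != nil {
    		panic(err)
    	}
    	stopWithGrace(cmd, 30*time.Second)
    }
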
ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.985298 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="agent" Mar 18 16:57:58.985504 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.985315 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b897baac-bbf7-4ed9-8ac8-7a44308baaae" containerName="kserve-container" Mar 18 16:57:58.988589 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:58.988568 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:57:59.053827 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.053802 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw"] Mar 18 16:57:59.082671 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.082632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf5vs\" (UniqueName: \"kubernetes.io/projected/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kube-api-access-qf5vs\") pod \"isvc-lightgbm-predictor-5785cfb7f4-hwrxw\" (UID: \"6e3eb22b-19c9-438b-87d5-d7cef38ea59a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:57:59.082671 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.082672 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kserve-provision-location\") pod \"isvc-lightgbm-predictor-5785cfb7f4-hwrxw\" (UID: \"6e3eb22b-19c9-438b-87d5-d7cef38ea59a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:57:59.184087 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.184054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qf5vs\" (UniqueName: \"kubernetes.io/projected/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kube-api-access-qf5vs\") pod \"isvc-lightgbm-predictor-5785cfb7f4-hwrxw\" (UID: \"6e3eb22b-19c9-438b-87d5-d7cef38ea59a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:57:59.184216 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.184104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kserve-provision-location\") pod \"isvc-lightgbm-predictor-5785cfb7f4-hwrxw\" (UID: \"6e3eb22b-19c9-438b-87d5-d7cef38ea59a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:57:59.184493 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.184471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kserve-provision-location\") pod \"isvc-lightgbm-predictor-5785cfb7f4-hwrxw\" (UID: \"6e3eb22b-19c9-438b-87d5-d7cef38ea59a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:57:59.191822 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.191802 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf5vs\" (UniqueName: \"kubernetes.io/projected/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kube-api-access-qf5vs\") pod \"isvc-lightgbm-predictor-5785cfb7f4-hwrxw\" (UID: \"6e3eb22b-19c9-438b-87d5-d7cef38ea59a\") " 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:57:59.298446 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.298385 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:57:59.424966 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.421336 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw"] Mar 18 16:57:59.429108 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:57:59.429076 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e3eb22b_19c9_438b_87d5_d7cef38ea59a.slice/crio-d98d08ca87e3516c676309fb1bc06f9a5f89b9e60682f60edab2b186fbcc8046 WatchSource:0}: Error finding container d98d08ca87e3516c676309fb1bc06f9a5f89b9e60682f60edab2b186fbcc8046: Status 404 returned error can't find the container with id d98d08ca87e3516c676309fb1bc06f9a5f89b9e60682f60edab2b186fbcc8046 Mar 18 16:57:59.884400 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.884342 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" event={"ID":"6e3eb22b-19c9-438b-87d5-d7cef38ea59a","Type":"ContainerStarted","Data":"543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680"} Mar 18 16:57:59.884400 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:57:59.884405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" event={"ID":"6e3eb22b-19c9-438b-87d5-d7cef38ea59a","Type":"ContainerStarted","Data":"d98d08ca87e3516c676309fb1bc06f9a5f89b9e60682f60edab2b186fbcc8046"} Mar 18 16:58:01.286673 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:58:01.286645 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:58:03.898976 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:03.898945 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c351db7-dc44-43ab-ac42-12a48913a689" containerID="95add0971349057340cb95b26940877f87fc5fdc0ef74a8c8ac37c3d43d61b69" exitCode=0 Mar 18 16:58:03.899378 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:03.899013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" event={"ID":"0c351db7-dc44-43ab-ac42-12a48913a689","Type":"ContainerDied","Data":"95add0971349057340cb95b26940877f87fc5fdc0ef74a8c8ac37c3d43d61b69"} Mar 18 16:58:03.900413 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:03.900390 2575 generic.go:358] "Generic (PLEG): container finished" podID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerID="543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680" exitCode=0 Mar 18 16:58:03.900525 ip-10-0-139-49 
kubenswrapper[2575]: I0318 16:58:03.900427 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" event={"ID":"6e3eb22b-19c9-438b-87d5-d7cef38ea59a","Type":"ContainerDied","Data":"543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680"} Mar 18 16:58:04.286597 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:04.286511 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:58:04.287608 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:04.287284 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:58:10.929132 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:10.929096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" event={"ID":"6e3eb22b-19c9-438b-87d5-d7cef38ea59a","Type":"ContainerStarted","Data":"dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414"} Mar 18 16:58:10.929627 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:10.929466 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:58:10.930585 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:10.930556 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Mar 18 16:58:10.946249 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:10.946194 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podStartSLOduration=6.449821405 podStartE2EDuration="12.946176545s" podCreationTimestamp="2026-03-18 16:57:58 +0000 UTC" firstStartedPulling="2026-03-18 16:58:03.901693403 +0000 UTC m=+802.212946305" lastFinishedPulling="2026-03-18 16:58:10.398048542 +0000 UTC m=+808.709301445" observedRunningTime="2026-03-18 16:58:10.945061512 +0000 UTC m=+809.256314433" watchObservedRunningTime="2026-03-18 16:58:10.946176545 +0000 UTC m=+809.257429469" Mar 18 16:58:11.932745 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:11.932702 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Mar 18 16:58:14.285855 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:14.285813 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:58:14.286285 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:14.286170 2575 prober.go:120] "Probe failed" probeType="Readiness" 
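
[annotation] The startup-latency record above is internally consistent: E2E duration is observedRunningTime minus podCreationTimestamp (16:58:10.946 − 16:57:58 = 12.946s), and the SLO duration subtracts the image-pull window (16:58:03.902 → 16:58:10.398, about 6.496s), giving the reported ~6.4498s. Pods that pulled nothing report firstStartedPulling at the zero time and SLO == E2E, as in the earlier logger record. Reproducing the subtraction:

    // Reproducing podStartSLOduration from the tracker record: SLO duration
    // is the end-to-end startup time minus the time spent pulling images.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	created, _ := time.Parse(time.RFC3339Nano, "2026-03-18T16:57:58Z")
    	running, _ := time.Parse(time.RFC3339Nano, "2026-03-18T16:58:10.946176545Z")
    	pullStart, _ := time.Parse(time.RFC3339Nano, "2026-03-18T16:58:03.901693403Z")
    	pullEnd, _ := time.Parse(time.RFC3339Nano, "2026-03-18T16:58:10.398048542Z")

    	e2e := running.Sub(created)         // 12.946176545s
    	slo := e2e - pullEnd.Sub(pullStart) // minus ~6.496s of pulling
    	fmt.Println(e2e, slo)               // SLO matches the log to within rounding
    }
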
pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:58:16.291160 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:58:16.291129 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:58:21.933436 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:21.933396 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Mar 18 16:58:24.285882 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:24.285832 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Mar 18 16:58:24.286432 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:24.286402 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:58:24.290456 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:24.290437 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:58:24.290546 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:24.290483 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:58:28.991922 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:28.991843 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c351db7-dc44-43ab-ac42-12a48913a689" containerID="d27af57178ea649fbec2d63fe94d9d810c0934c05be4bed58584243a73ff5c5b" exitCode=0 Mar 18 16:58:28.992239 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:28.991918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" event={"ID":"0c351db7-dc44-43ab-ac42-12a48913a689","Type":"ContainerDied","Data":"d27af57178ea649fbec2d63fe94d9d810c0934c05be4bed58584243a73ff5c5b"} Mar 18 16:58:29.286945 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:58:29.286915 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: 
initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:58:29.324422 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:29.324398 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:58:29.428212 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:29.428177 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c351db7-dc44-43ab-ac42-12a48913a689-kserve-provision-location\") pod \"0c351db7-dc44-43ab-ac42-12a48913a689\" (UID: \"0c351db7-dc44-43ab-ac42-12a48913a689\") " Mar 18 16:58:29.428401 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:29.428220 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8n87\" (UniqueName: \"kubernetes.io/projected/0c351db7-dc44-43ab-ac42-12a48913a689-kube-api-access-s8n87\") pod \"0c351db7-dc44-43ab-ac42-12a48913a689\" (UID: \"0c351db7-dc44-43ab-ac42-12a48913a689\") " Mar 18 16:58:29.428540 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:29.428513 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c351db7-dc44-43ab-ac42-12a48913a689-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0c351db7-dc44-43ab-ac42-12a48913a689" (UID: "0c351db7-dc44-43ab-ac42-12a48913a689"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:58:29.430511 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:29.430485 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c351db7-dc44-43ab-ac42-12a48913a689-kube-api-access-s8n87" (OuterVolumeSpecName: "kube-api-access-s8n87") pod "0c351db7-dc44-43ab-ac42-12a48913a689" (UID: "0c351db7-dc44-43ab-ac42-12a48913a689"). InnerVolumeSpecName "kube-api-access-s8n87". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:58:29.529707 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:29.529632 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c351db7-dc44-43ab-ac42-12a48913a689-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:58:29.529707 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:29.529668 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s8n87\" (UniqueName: \"kubernetes.io/projected/0c351db7-dc44-43ab-ac42-12a48913a689-kube-api-access-s8n87\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:58:29.997982 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:29.997959 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" Mar 18 16:58:29.998437 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:29.997957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl" event={"ID":"0c351db7-dc44-43ab-ac42-12a48913a689","Type":"ContainerDied","Data":"4f2e483c9486e28c6611e9ecdfa23bc6c385e073e7162bc3c3a111144ee16ea5"} Mar 18 16:58:29.998437 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:29.998072 2575 scope.go:117] "RemoveContainer" containerID="d27af57178ea649fbec2d63fe94d9d810c0934c05be4bed58584243a73ff5c5b" Mar 18 16:58:30.008021 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:30.008006 2575 scope.go:117] "RemoveContainer" containerID="95add0971349057340cb95b26940877f87fc5fdc0ef74a8c8ac37c3d43d61b69" Mar 18 16:58:30.015378 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:30.015336 2575 scope.go:117] "RemoveContainer" containerID="d74e59295a2cf90df236d005ef947fb79fc4ed20754ee6ddbe1670fd0c738459" Mar 18 16:58:30.019178 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:30.019158 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl"] Mar 18 16:58:30.021816 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:30.021794 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-678fd49897-xd9nl"] Mar 18 16:58:30.290758 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:30.290684 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" path="/var/lib/kubelet/pods/0c351db7-dc44-43ab-ac42-12a48913a689/volumes" Mar 18 16:58:31.932743 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:31.932700 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Mar 18 16:58:40.287435 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:58:40.287403 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:58:41.933354 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:41.933313 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Mar 18 16:58:51.932836 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:58:51.932792 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Mar 18 16:58:54.287701 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:58:54.287667 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:59:01.932659 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:01.932614 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Mar 18 16:59:06.286918 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:59:06.286886 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:59:11.933539 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:11.933498 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Mar 18 16:59:19.287275 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:59:19.287239 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:59:21.932863 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:21.932784 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Mar 18 16:59:28.290496 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:28.290467 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:59:28.676780 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:28.676748 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw"] Mar 18 16:59:29.045950 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.045867 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74"] Mar 18 16:59:29.046195 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.046183 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="storage-initializer" Mar 18 16:59:29.046249 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.046196 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="storage-initializer" Mar 18 16:59:29.046249 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.046213 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" Mar 18 16:59:29.046249 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.046218 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" Mar 18 16:59:29.046249 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.046228 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" Mar 18 16:59:29.046249 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.046234 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" Mar 18 16:59:29.046453 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.046288 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="agent" Mar 18 16:59:29.046453 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.046295 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c351db7-dc44-43ab-ac42-12a48913a689" containerName="kserve-container" Mar 18 16:59:29.049287 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.049270 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 16:59:29.056773 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.056749 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74"] Mar 18 16:59:29.191059 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.191022 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-7c89745488-b4d74\" (UID: \"2876d9b4-827f-4ccc-8ff0-fcbebcffb615\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 16:59:29.191209 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.191064 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhd8d\" (UniqueName: \"kubernetes.io/projected/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kube-api-access-bhd8d\") pod \"isvc-lightgbm-runtime-predictor-7c89745488-b4d74\" (UID: \"2876d9b4-827f-4ccc-8ff0-fcbebcffb615\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 16:59:29.291653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.291618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-7c89745488-b4d74\" (UID: \"2876d9b4-827f-4ccc-8ff0-fcbebcffb615\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 16:59:29.291653 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.291657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhd8d\" (UniqueName: \"kubernetes.io/projected/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kube-api-access-bhd8d\") pod \"isvc-lightgbm-runtime-predictor-7c89745488-b4d74\" (UID: \"2876d9b4-827f-4ccc-8ff0-fcbebcffb615\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 16:59:29.292213 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.292082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-7c89745488-b4d74\" (UID: \"2876d9b4-827f-4ccc-8ff0-fcbebcffb615\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 16:59:29.299918 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.299863 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhd8d\" (UniqueName: \"kubernetes.io/projected/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kube-api-access-bhd8d\") pod \"isvc-lightgbm-runtime-predictor-7c89745488-b4d74\" (UID: \"2876d9b4-827f-4ccc-8ff0-fcbebcffb615\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 16:59:29.360781 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.360752 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 16:59:29.484618 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:29.484589 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74"] Mar 18 16:59:29.487959 ip-10-0-139-49 kubenswrapper[2575]: W0318 16:59:29.487933 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2876d9b4_827f_4ccc_8ff0_fcbebcffb615.slice/crio-9a10604af2a864267cbbe1ad5c50b3a00b375a95e9e3448935af7cabb6008044 WatchSource:0}: Error finding container 9a10604af2a864267cbbe1ad5c50b3a00b375a95e9e3448935af7cabb6008044: Status 404 returned error can't find the container with id 9a10604af2a864267cbbe1ad5c50b3a00b375a95e9e3448935af7cabb6008044 Mar 18 16:59:30.203181 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:30.203146 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" event={"ID":"2876d9b4-827f-4ccc-8ff0-fcbebcffb615","Type":"ContainerStarted","Data":"cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4"} Mar 18 16:59:30.203181 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:30.203182 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" event={"ID":"2876d9b4-827f-4ccc-8ff0-fcbebcffb615","Type":"ContainerStarted","Data":"9a10604af2a864267cbbe1ad5c50b3a00b375a95e9e3448935af7cabb6008044"} Mar 18 16:59:30.203448 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:30.203407 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" containerID="cri-o://dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414" gracePeriod=30 Mar 18 16:59:32.288971 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:59:32.288855 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:59:34.220421 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:34.220390 2575 generic.go:358] "Generic (PLEG): container finished" podID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerID="cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4" exitCode=0 Mar 18 16:59:34.220807 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:34.220460 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" event={"ID":"2876d9b4-827f-4ccc-8ff0-fcbebcffb615","Type":"ContainerDied","Data":"cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4"} Mar 18 16:59:34.445095 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:34.445074 
2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:59:34.539190 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:34.539105 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf5vs\" (UniqueName: \"kubernetes.io/projected/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kube-api-access-qf5vs\") pod \"6e3eb22b-19c9-438b-87d5-d7cef38ea59a\" (UID: \"6e3eb22b-19c9-438b-87d5-d7cef38ea59a\") " Mar 18 16:59:34.539350 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:34.539224 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kserve-provision-location\") pod \"6e3eb22b-19c9-438b-87d5-d7cef38ea59a\" (UID: \"6e3eb22b-19c9-438b-87d5-d7cef38ea59a\") " Mar 18 16:59:34.539584 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:34.539561 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6e3eb22b-19c9-438b-87d5-d7cef38ea59a" (UID: "6e3eb22b-19c9-438b-87d5-d7cef38ea59a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:59:34.541309 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:34.541288 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kube-api-access-qf5vs" (OuterVolumeSpecName: "kube-api-access-qf5vs") pod "6e3eb22b-19c9-438b-87d5-d7cef38ea59a" (UID: "6e3eb22b-19c9-438b-87d5-d7cef38ea59a"). InnerVolumeSpecName "kube-api-access-qf5vs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:59:34.640308 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:34.640267 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:59:34.640308 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:34.640296 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qf5vs\" (UniqueName: \"kubernetes.io/projected/6e3eb22b-19c9-438b-87d5-d7cef38ea59a-kube-api-access-qf5vs\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 16:59:35.225487 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.225454 2575 generic.go:358] "Generic (PLEG): container finished" podID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerID="dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414" exitCode=0 Mar 18 16:59:35.225945 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.225526 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" Mar 18 16:59:35.225945 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.225546 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" event={"ID":"6e3eb22b-19c9-438b-87d5-d7cef38ea59a","Type":"ContainerDied","Data":"dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414"} Mar 18 16:59:35.225945 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.225580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw" event={"ID":"6e3eb22b-19c9-438b-87d5-d7cef38ea59a","Type":"ContainerDied","Data":"d98d08ca87e3516c676309fb1bc06f9a5f89b9e60682f60edab2b186fbcc8046"} Mar 18 16:59:35.225945 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.225605 2575 scope.go:117] "RemoveContainer" containerID="dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414" Mar 18 16:59:35.227749 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.227726 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" event={"ID":"2876d9b4-827f-4ccc-8ff0-fcbebcffb615","Type":"ContainerStarted","Data":"4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341"} Mar 18 16:59:35.228011 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.227995 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 16:59:35.229725 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.229684 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Mar 18 16:59:35.234447 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.234431 2575 scope.go:117] "RemoveContainer" containerID="543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680" Mar 18 16:59:35.243141 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.243119 2575 scope.go:117] "RemoveContainer" containerID="dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414" Mar 18 16:59:35.243466 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:59:35.243439 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414\": container with ID starting with dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414 not found: ID does not exist" containerID="dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414" Mar 18 16:59:35.243556 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.243473 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414"} err="failed to get container status \"dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414\": rpc error: code = NotFound desc = could not find container \"dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414\": container with ID starting with dc87af24bd7bd7f57dd0ce5dd2a5ba186818674a6e21c3d0ba062e72c816e414 not found: ID does not exist" Mar 18 16:59:35.243556 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.243505 2575 scope.go:117] 
"RemoveContainer" containerID="543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680" Mar 18 16:59:35.243893 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:59:35.243866 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680\": container with ID starting with 543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680 not found: ID does not exist" containerID="543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680" Mar 18 16:59:35.244001 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.243905 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680"} err="failed to get container status \"543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680\": rpc error: code = NotFound desc = could not find container \"543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680\": container with ID starting with 543c29e889e53f54934786b91705a0b9af8e3da21ff15f9169b4db2e9e517680 not found: ID does not exist" Mar 18 16:59:35.244911 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.244868 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podStartSLOduration=6.244854301 podStartE2EDuration="6.244854301s" podCreationTimestamp="2026-03-18 16:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:59:35.24307062 +0000 UTC m=+893.554323542" watchObservedRunningTime="2026-03-18 16:59:35.244854301 +0000 UTC m=+893.556107222" Mar 18 16:59:35.254701 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.254681 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw"] Mar 18 16:59:35.260570 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:35.260551 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-5785cfb7f4-hwrxw"] Mar 18 16:59:36.232988 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:36.232944 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Mar 18 16:59:36.290240 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:36.290200 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" path="/var/lib/kubelet/pods/6e3eb22b-19c9-438b-87d5-d7cef38ea59a/volumes" Mar 18 16:59:42.222069 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:42.222039 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 16:59:42.224234 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:42.224214 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 16:59:45.287573 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:59:45.287534 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 16:59:46.233463 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:46.233425 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Mar 18 16:59:56.233835 ip-10-0-139-49 kubenswrapper[2575]: I0318 16:59:56.233791 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Mar 18 16:59:57.287385 ip-10-0-139-49 kubenswrapper[2575]: E0318 16:59:57.287333 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:00:06.233072 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:06.233028 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Mar 18 17:00:11.286619 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:11.286580 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:00:11.286927 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:00:11.286858 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not 
authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:00:16.233837 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:16.233788 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Mar 18 17:00:25.287412 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:00:25.287345 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:00:26.233257 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:26.233215 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Mar 18 17:00:36.233247 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:36.233205 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Mar 18 17:00:39.286653 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:00:39.286604 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:00:44.286719 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:44.286672 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Mar 18 17:00:54.287200 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:00:54.287168 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:00:54.291344 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:54.291318 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 17:00:59.081903 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.081866 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74"] Mar 18 17:00:59.082255 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.082214 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" containerID="cri-o://4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341" gracePeriod=30 Mar 18 17:00:59.446836 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.446798 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49"] Mar 18 17:00:59.447194 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.447177 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" Mar 18 17:00:59.447274 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.447197 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" Mar 18 17:00:59.447274 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.447216 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="storage-initializer" Mar 18 17:00:59.447274 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.447224 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="storage-initializer" Mar 18 17:00:59.447467 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.447318 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e3eb22b-19c9-438b-87d5-d7cef38ea59a" containerName="kserve-container" Mar 18 17:00:59.450500 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.450478 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:00:59.458008 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.457983 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49"] Mar 18 17:00:59.621450 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.621415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1943771e-5313-48a4-8e8c-ea563c08702c-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49\" (UID: \"1943771e-5313-48a4-8e8c-ea563c08702c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:00:59.621623 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.621466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qbrr\" (UniqueName: \"kubernetes.io/projected/1943771e-5313-48a4-8e8c-ea563c08702c-kube-api-access-4qbrr\") pod \"isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49\" (UID: \"1943771e-5313-48a4-8e8c-ea563c08702c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:00:59.722936 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.722842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1943771e-5313-48a4-8e8c-ea563c08702c-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49\" (UID: \"1943771e-5313-48a4-8e8c-ea563c08702c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:00:59.722936 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.722911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qbrr\" (UniqueName: \"kubernetes.io/projected/1943771e-5313-48a4-8e8c-ea563c08702c-kube-api-access-4qbrr\") pod \"isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49\" (UID: \"1943771e-5313-48a4-8e8c-ea563c08702c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:00:59.723241 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.723222 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1943771e-5313-48a4-8e8c-ea563c08702c-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49\" (UID: \"1943771e-5313-48a4-8e8c-ea563c08702c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:00:59.730908 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.730882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qbrr\" (UniqueName: \"kubernetes.io/projected/1943771e-5313-48a4-8e8c-ea563c08702c-kube-api-access-4qbrr\") pod \"isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49\" (UID: \"1943771e-5313-48a4-8e8c-ea563c08702c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:00:59.761907 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.761887 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:00:59.883208 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:00:59.883186 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49"] Mar 18 17:00:59.885900 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:00:59.885863 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1943771e_5313_48a4_8e8c_ea563c08702c.slice/crio-9169acf8e9bfd556c48c58eb54f978c2e2dc308aa9736bb9e64db532635c7514 WatchSource:0}: Error finding container 9169acf8e9bfd556c48c58eb54f978c2e2dc308aa9736bb9e64db532635c7514: Status 404 returned error can't find the container with id 9169acf8e9bfd556c48c58eb54f978c2e2dc308aa9736bb9e64db532635c7514 Mar 18 17:01:00.504472 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:00.504438 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" event={"ID":"1943771e-5313-48a4-8e8c-ea563c08702c","Type":"ContainerStarted","Data":"72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404"} Mar 18 17:01:00.504472 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:00.504473 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" event={"ID":"1943771e-5313-48a4-8e8c-ea563c08702c","Type":"ContainerStarted","Data":"9169acf8e9bfd556c48c58eb54f978c2e2dc308aa9736bb9e64db532635c7514"} Mar 18 17:01:04.115806 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.115782 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 17:01:04.159801 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.159737 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kserve-provision-location\") pod \"2876d9b4-827f-4ccc-8ff0-fcbebcffb615\" (UID: \"2876d9b4-827f-4ccc-8ff0-fcbebcffb615\") " Mar 18 17:01:04.159901 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.159837 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhd8d\" (UniqueName: \"kubernetes.io/projected/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kube-api-access-bhd8d\") pod \"2876d9b4-827f-4ccc-8ff0-fcbebcffb615\" (UID: \"2876d9b4-827f-4ccc-8ff0-fcbebcffb615\") " Mar 18 17:01:04.160099 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.160075 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2876d9b4-827f-4ccc-8ff0-fcbebcffb615" (UID: "2876d9b4-827f-4ccc-8ff0-fcbebcffb615"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:01:04.161953 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.161932 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kube-api-access-bhd8d" (OuterVolumeSpecName: "kube-api-access-bhd8d") pod "2876d9b4-827f-4ccc-8ff0-fcbebcffb615" (UID: "2876d9b4-827f-4ccc-8ff0-fcbebcffb615"). InnerVolumeSpecName "kube-api-access-bhd8d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:01:04.260395 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.260339 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bhd8d\" (UniqueName: \"kubernetes.io/projected/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kube-api-access-bhd8d\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:01:04.260395 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.260397 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2876d9b4-827f-4ccc-8ff0-fcbebcffb615-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:01:04.518812 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.518727 2575 generic.go:358] "Generic (PLEG): container finished" podID="1943771e-5313-48a4-8e8c-ea563c08702c" containerID="72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404" exitCode=0 Mar 18 17:01:04.518812 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.518801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" event={"ID":"1943771e-5313-48a4-8e8c-ea563c08702c","Type":"ContainerDied","Data":"72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404"} Mar 18 17:01:04.520336 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.520314 2575 generic.go:358] "Generic (PLEG): container finished" podID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerID="4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341" exitCode=0 Mar 18 17:01:04.520420 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.520381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" event={"ID":"2876d9b4-827f-4ccc-8ff0-fcbebcffb615","Type":"ContainerDied","Data":"4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341"} Mar 18 17:01:04.520420 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.520393 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" Mar 18 17:01:04.520420 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.520411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74" event={"ID":"2876d9b4-827f-4ccc-8ff0-fcbebcffb615","Type":"ContainerDied","Data":"9a10604af2a864267cbbe1ad5c50b3a00b375a95e9e3448935af7cabb6008044"} Mar 18 17:01:04.520560 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.520432 2575 scope.go:117] "RemoveContainer" containerID="4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341" Mar 18 17:01:04.528447 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.528425 2575 scope.go:117] "RemoveContainer" containerID="cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4" Mar 18 17:01:04.537808 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.537790 2575 scope.go:117] "RemoveContainer" containerID="4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341" Mar 18 17:01:04.538070 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:01:04.538039 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341\": container with ID starting with 4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341 not found: ID does not exist" containerID="4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341" Mar 18 17:01:04.538132 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.538068 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341"} err="failed to get container status \"4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341\": rpc error: code = NotFound desc = could not find container \"4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341\": container with ID starting with 4a2798d238c341d90613f657df0577809b1686ce8f5f2b7ea3735d3c5f714341 not found: ID does not exist" Mar 18 17:01:04.538132 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.538092 2575 scope.go:117] "RemoveContainer" containerID="cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4" Mar 18 17:01:04.538349 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:01:04.538331 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4\": container with ID starting with cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4 not found: ID does not exist" containerID="cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4" Mar 18 17:01:04.538422 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.538373 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4"} err="failed to get container status \"cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4\": rpc error: code = NotFound desc = could not find container \"cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4\": container with ID starting with cc9a6b92a9ba72ddcbe346663b9db8eea1ae874a7a8b407cebfde049ba397eb4 not found: ID does not exist" Mar 18 17:01:04.544705 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.544683 2575 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74"] Mar 18 17:01:04.548779 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:04.548761 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7c89745488-b4d74"] Mar 18 17:01:05.287463 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:01:05.287418 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:01:06.293518 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:01:06.293477 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" path="/var/lib/kubelet/pods/2876d9b4-827f-4ccc-8ff0-fcbebcffb615/volumes" Mar 18 17:01:20.287014 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:01:20.286925 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:01:33.287782 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:01:33.287751 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:01:46.287765 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:01:46.287723 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading 
manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:01:57.287343 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:01:57.287287 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:02:10.287292 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:02:10.287251 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:02:23.287328 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:02:23.287277 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:02:34.563824 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:02:34.563715 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not 
authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:02:34.564234 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:02:34.563980 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 17:02:34.565192 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:02:34.565161 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in 
Mar 18 17:02:46.287570 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:02:46.287522 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:02:59.287426 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:02:59.287384 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:03:11.286843 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:03:11.286795 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:03:13.998348 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:13.998311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" event={"ID":"1943771e-5313-48a4-8e8c-ea563c08702c","Type":"ContainerStarted","Data":"d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35"}
Mar 18 17:03:13.998740 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:13.998408 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49"
Mar 18 17:03:14.015302 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:14.015248 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" podStartSLOduration=6.073269763 podStartE2EDuration="2m15.015237537s" podCreationTimestamp="2026-03-18 17:00:59 +0000 UTC" firstStartedPulling="2026-03-18 17:01:04.519959758 +0000 UTC m=+982.831212657" lastFinishedPulling="2026-03-18 17:03:13.461927529 +0000 UTC m=+1111.773180431" observedRunningTime="2026-03-18 17:03:14.013841333 +0000 UTC m=+1112.325094256" watchObservedRunningTime="2026-03-18 17:03:14.015237537 +0000 UTC m=+1112.326490457"
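The pod_startup_latency_tracker entry above is worth decoding. podStartE2EDuration (2m15.015s) is measured from podCreationTimestamp to the observed running time, while podStartSLOduration (6.073s) excludes the image pull window, and the numbers tie out: the pull ran from firstStartedPulling to lastFinishedPulling, about 2m8.94s, and 2m15.015s minus 2m8.94s gives the reported 6.073s. (Compare the earlier entry for isvc-lightgbm-runtime-predictor, where the pull timestamps are zero and SLO equals E2E at 6.2448s.) A small sketch that recomputes this from the timestamps in the entry; the last digits differ from the logged value by a few nanoseconds because the kubelet uses the monotonic m=+… readings rather than wall-clock differences:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Timestamps copied from the pod_startup_latency_tracker entry above.
	created := parse("2026-03-18 17:00:59 +0000 UTC")
	pullStart := parse("2026-03-18 17:01:04.519959758 +0000 UTC")
	pullEnd := parse("2026-03-18 17:03:13.461927529 +0000 UTC")
	running := parse("2026-03-18 17:03:14.015237537 +0000 UTC")

	e2e := running.Sub(created)
	pull := pullEnd.Sub(pullStart)
	fmt.Println("e2e:", e2e)          // 2m15.015237537s, the podStartE2EDuration
	fmt.Println("pull window:", pull) // ~2m8.94s spent pulling the image
	fmt.Println("slo:", e2e-pull)     // ~6.073s, the podStartSLOduration
}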
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" podStartSLOduration=6.073269763 podStartE2EDuration="2m15.015237537s" podCreationTimestamp="2026-03-18 17:00:59 +0000 UTC" firstStartedPulling="2026-03-18 17:01:04.519959758 +0000 UTC m=+982.831212657" lastFinishedPulling="2026-03-18 17:03:13.461927529 +0000 UTC m=+1111.773180431" observedRunningTime="2026-03-18 17:03:14.013841333 +0000 UTC m=+1112.325094256" watchObservedRunningTime="2026-03-18 17:03:14.015237537 +0000 UTC m=+1112.326490457" Mar 18 17:03:24.286831 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:03:24.286778 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:03:37.286616 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:03:37.286586 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:03:45.006290 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:45.006263 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:03:48.286688 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:03:48.286583 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:03:49.575960 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.575927 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49"] Mar 18 17:03:49.576417 ip-10-0-139-49 
kubenswrapper[2575]: I0318 17:03:49.576288 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" podUID="1943771e-5313-48a4-8e8c-ea563c08702c" containerName="kserve-container" containerID="cri-o://d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35" gracePeriod=30 Mar 18 17:03:49.649479 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.649446 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr"] Mar 18 17:03:49.649802 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.649787 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" Mar 18 17:03:49.649802 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.649801 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" Mar 18 17:03:49.649952 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.649819 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="storage-initializer" Mar 18 17:03:49.649952 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.649827 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="storage-initializer" Mar 18 17:03:49.649952 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.649897 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2876d9b4-827f-4ccc-8ff0-fcbebcffb615" containerName="kserve-container" Mar 18 17:03:49.689820 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.689786 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr"] Mar 18 17:03:49.689983 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.689905 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:03:49.776541 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.776501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93616781-bc34-4623-9464-8c9da06de097-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr\" (UID: \"93616781-bc34-4623-9464-8c9da06de097\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:03:49.776541 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.776547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssbw9\" (UniqueName: \"kubernetes.io/projected/93616781-bc34-4623-9464-8c9da06de097-kube-api-access-ssbw9\") pod \"isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr\" (UID: \"93616781-bc34-4623-9464-8c9da06de097\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:03:49.877740 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.877634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93616781-bc34-4623-9464-8c9da06de097-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr\" (UID: \"93616781-bc34-4623-9464-8c9da06de097\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:03:49.877740 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.877676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbw9\" (UniqueName: \"kubernetes.io/projected/93616781-bc34-4623-9464-8c9da06de097-kube-api-access-ssbw9\") pod \"isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr\" (UID: \"93616781-bc34-4623-9464-8c9da06de097\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:03:49.878007 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.877985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93616781-bc34-4623-9464-8c9da06de097-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr\" (UID: \"93616781-bc34-4623-9464-8c9da06de097\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:03:49.885812 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:49.885785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbw9\" (UniqueName: \"kubernetes.io/projected/93616781-bc34-4623-9464-8c9da06de097-kube-api-access-ssbw9\") pod \"isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr\" (UID: \"93616781-bc34-4623-9464-8c9da06de097\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:03:50.000261 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:50.000234 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:03:50.136210 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:50.136176 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr"] Mar 18 17:03:50.137535 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:03:50.137504 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93616781_bc34_4623_9464_8c9da06de097.slice/crio-3cd4b69ed84ed507eddcb2091094134d4cedbbd0a9425ef6cc2502b3f8d5e4c1 WatchSource:0}: Error finding container 3cd4b69ed84ed507eddcb2091094134d4cedbbd0a9425ef6cc2502b3f8d5e4c1: Status 404 returned error can't find the container with id 3cd4b69ed84ed507eddcb2091094134d4cedbbd0a9425ef6cc2502b3f8d5e4c1 Mar 18 17:03:50.534741 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:50.534720 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:03:50.583869 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:50.583845 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qbrr\" (UniqueName: \"kubernetes.io/projected/1943771e-5313-48a4-8e8c-ea563c08702c-kube-api-access-4qbrr\") pod \"1943771e-5313-48a4-8e8c-ea563c08702c\" (UID: \"1943771e-5313-48a4-8e8c-ea563c08702c\") " Mar 18 17:03:50.584197 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:50.583902 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1943771e-5313-48a4-8e8c-ea563c08702c-kserve-provision-location\") pod \"1943771e-5313-48a4-8e8c-ea563c08702c\" (UID: \"1943771e-5313-48a4-8e8c-ea563c08702c\") " Mar 18 17:03:50.584301 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:50.584270 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1943771e-5313-48a4-8e8c-ea563c08702c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1943771e-5313-48a4-8e8c-ea563c08702c" (UID: "1943771e-5313-48a4-8e8c-ea563c08702c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:03:50.586062 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:50.586034 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1943771e-5313-48a4-8e8c-ea563c08702c-kube-api-access-4qbrr" (OuterVolumeSpecName: "kube-api-access-4qbrr") pod "1943771e-5313-48a4-8e8c-ea563c08702c" (UID: "1943771e-5313-48a4-8e8c-ea563c08702c"). InnerVolumeSpecName "kube-api-access-4qbrr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:03:50.685263 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:50.685183 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4qbrr\" (UniqueName: \"kubernetes.io/projected/1943771e-5313-48a4-8e8c-ea563c08702c-kube-api-access-4qbrr\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:03:50.685263 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:50.685216 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1943771e-5313-48a4-8e8c-ea563c08702c-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:03:51.125556 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.125467 2575 generic.go:358] "Generic (PLEG): container finished" podID="1943771e-5313-48a4-8e8c-ea563c08702c" containerID="d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35" exitCode=0 Mar 18 17:03:51.125556 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.125533 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" Mar 18 17:03:51.125556 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.125545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" event={"ID":"1943771e-5313-48a4-8e8c-ea563c08702c","Type":"ContainerDied","Data":"d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35"} Mar 18 17:03:51.125828 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.125591 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49" event={"ID":"1943771e-5313-48a4-8e8c-ea563c08702c","Type":"ContainerDied","Data":"9169acf8e9bfd556c48c58eb54f978c2e2dc308aa9736bb9e64db532635c7514"} Mar 18 17:03:51.125828 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.125607 2575 scope.go:117] "RemoveContainer" containerID="d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35" Mar 18 17:03:51.127100 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.127070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" event={"ID":"93616781-bc34-4623-9464-8c9da06de097","Type":"ContainerStarted","Data":"9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd"} Mar 18 17:03:51.127215 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.127108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" event={"ID":"93616781-bc34-4623-9464-8c9da06de097","Type":"ContainerStarted","Data":"3cd4b69ed84ed507eddcb2091094134d4cedbbd0a9425ef6cc2502b3f8d5e4c1"} Mar 18 17:03:51.139303 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.139266 2575 scope.go:117] "RemoveContainer" containerID="72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404" Mar 18 17:03:51.149033 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.148903 2575 scope.go:117] "RemoveContainer" containerID="d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35" Mar 18 17:03:51.149248 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:03:51.149214 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35\": container 
with ID starting with d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35 not found: ID does not exist" containerID="d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35" Mar 18 17:03:51.149248 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.149239 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35"} err="failed to get container status \"d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35\": rpc error: code = NotFound desc = could not find container \"d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35\": container with ID starting with d12951aed5f01e51a3d5192389a4aab65bef692a85943f7ba93c8d254db17d35 not found: ID does not exist" Mar 18 17:03:51.149338 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.149256 2575 scope.go:117] "RemoveContainer" containerID="72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404" Mar 18 17:03:51.149519 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:03:51.149493 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404\": container with ID starting with 72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404 not found: ID does not exist" containerID="72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404" Mar 18 17:03:51.149619 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.149523 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404"} err="failed to get container status \"72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404\": rpc error: code = NotFound desc = could not find container \"72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404\": container with ID starting with 72ee0bb9477ae03294b071b64a989498d90c9a956318b6826380acb1f49af404 not found: ID does not exist" Mar 18 17:03:51.166481 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.166459 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49"] Mar 18 17:03:51.170996 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:51.170975 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-7b74c59fb-ccq49"] Mar 18 17:03:52.289903 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:52.289871 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1943771e-5313-48a4-8e8c-ea563c08702c" path="/var/lib/kubelet/pods/1943771e-5313-48a4-8e8c-ea563c08702c/volumes" Mar 18 17:03:54.140530 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:54.140491 2575 generic.go:358] "Generic (PLEG): container finished" podID="93616781-bc34-4623-9464-8c9da06de097" containerID="9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd" exitCode=0 Mar 18 17:03:54.140822 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:54.140562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" event={"ID":"93616781-bc34-4623-9464-8c9da06de097","Type":"ContainerDied","Data":"9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd"} Mar 18 17:03:55.145874 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:55.145842 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" event={"ID":"93616781-bc34-4623-9464-8c9da06de097","Type":"ContainerStarted","Data":"9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834"} Mar 18 17:03:55.146259 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:55.146152 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:03:55.147271 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:55.147237 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" podUID="93616781-bc34-4623-9464-8c9da06de097" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Mar 18 17:03:55.161792 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:55.161734 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" podStartSLOduration=6.16171607 podStartE2EDuration="6.16171607s" podCreationTimestamp="2026-03-18 17:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:03:55.161061456 +0000 UTC m=+1153.472314377" watchObservedRunningTime="2026-03-18 17:03:55.16171607 +0000 UTC m=+1153.472968992" Mar 18 17:03:56.149716 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:03:56.149680 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" podUID="93616781-bc34-4623-9464-8c9da06de097" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Mar 18 17:04:01.287043 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:04:01.287013 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:04:06.150531 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:06.150503 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:04:09.408163 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.408133 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr"] Mar 18 17:04:09.408611 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.408412 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" podUID="93616781-bc34-4623-9464-8c9da06de097" containerName="kserve-container" containerID="cri-o://9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834" gracePeriod=30 Mar 
18 17:04:09.571287 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.571255 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr"] Mar 18 17:04:09.571730 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.571711 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1943771e-5313-48a4-8e8c-ea563c08702c" containerName="kserve-container" Mar 18 17:04:09.571803 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.571732 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1943771e-5313-48a4-8e8c-ea563c08702c" containerName="kserve-container" Mar 18 17:04:09.571803 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.571755 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1943771e-5313-48a4-8e8c-ea563c08702c" containerName="storage-initializer" Mar 18 17:04:09.571803 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.571761 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1943771e-5313-48a4-8e8c-ea563c08702c" containerName="storage-initializer" Mar 18 17:04:09.571964 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.571820 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1943771e-5313-48a4-8e8c-ea563c08702c" containerName="kserve-container" Mar 18 17:04:09.574932 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.574903 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:09.582078 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.582056 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr"] Mar 18 17:04:09.635796 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.635759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42232f00-58d9-41bb-9f87-940aab77d326-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr\" (UID: \"42232f00-58d9-41bb-9f87-940aab77d326\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:09.635971 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.635875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzq47\" (UniqueName: \"kubernetes.io/projected/42232f00-58d9-41bb-9f87-940aab77d326-kube-api-access-tzq47\") pod \"isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr\" (UID: \"42232f00-58d9-41bb-9f87-940aab77d326\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:09.736823 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.736738 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzq47\" (UniqueName: \"kubernetes.io/projected/42232f00-58d9-41bb-9f87-940aab77d326-kube-api-access-tzq47\") pod \"isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr\" (UID: \"42232f00-58d9-41bb-9f87-940aab77d326\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:09.736823 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.736784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42232f00-58d9-41bb-9f87-940aab77d326-kserve-provision-location\") pod 
\"isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr\" (UID: \"42232f00-58d9-41bb-9f87-940aab77d326\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:09.737142 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.737127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42232f00-58d9-41bb-9f87-940aab77d326-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr\" (UID: \"42232f00-58d9-41bb-9f87-940aab77d326\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:09.744663 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.744639 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzq47\" (UniqueName: \"kubernetes.io/projected/42232f00-58d9-41bb-9f87-940aab77d326-kube-api-access-tzq47\") pod \"isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr\" (UID: \"42232f00-58d9-41bb-9f87-940aab77d326\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:09.887644 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:09.887619 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:10.083999 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.083972 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:04:10.139958 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.139929 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93616781-bc34-4623-9464-8c9da06de097-kserve-provision-location\") pod \"93616781-bc34-4623-9464-8c9da06de097\" (UID: \"93616781-bc34-4623-9464-8c9da06de097\") " Mar 18 17:04:10.140126 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.140004 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssbw9\" (UniqueName: \"kubernetes.io/projected/93616781-bc34-4623-9464-8c9da06de097-kube-api-access-ssbw9\") pod \"93616781-bc34-4623-9464-8c9da06de097\" (UID: \"93616781-bc34-4623-9464-8c9da06de097\") " Mar 18 17:04:10.140375 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.140322 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93616781-bc34-4623-9464-8c9da06de097-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "93616781-bc34-4623-9464-8c9da06de097" (UID: "93616781-bc34-4623-9464-8c9da06de097"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:04:10.142392 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.142348 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93616781-bc34-4623-9464-8c9da06de097-kube-api-access-ssbw9" (OuterVolumeSpecName: "kube-api-access-ssbw9") pod "93616781-bc34-4623-9464-8c9da06de097" (UID: "93616781-bc34-4623-9464-8c9da06de097"). InnerVolumeSpecName "kube-api-access-ssbw9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:04:10.195537 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.195506 2575 generic.go:358] "Generic (PLEG): container finished" podID="93616781-bc34-4623-9464-8c9da06de097" containerID="9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834" exitCode=0 Mar 18 17:04:10.195696 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.195588 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" event={"ID":"93616781-bc34-4623-9464-8c9da06de097","Type":"ContainerDied","Data":"9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834"} Mar 18 17:04:10.195696 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.195595 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" Mar 18 17:04:10.195696 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.195622 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr" event={"ID":"93616781-bc34-4623-9464-8c9da06de097","Type":"ContainerDied","Data":"3cd4b69ed84ed507eddcb2091094134d4cedbbd0a9425ef6cc2502b3f8d5e4c1"} Mar 18 17:04:10.195696 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.195642 2575 scope.go:117] "RemoveContainer" containerID="9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834" Mar 18 17:04:10.204692 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.204678 2575 scope.go:117] "RemoveContainer" containerID="9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd" Mar 18 17:04:10.214494 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.214459 2575 scope.go:117] "RemoveContainer" containerID="9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834" Mar 18 17:04:10.215803 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:04:10.215767 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834\": container with ID starting with 9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834 not found: ID does not exist" containerID="9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834" Mar 18 17:04:10.215907 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.215828 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834"} err="failed to get container status \"9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834\": rpc error: code = NotFound desc = could not find container \"9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834\": container with ID starting with 9e590cc1d17d0a1be644726adc420c170769934d0bbb7135da4cf2678e623834 not found: ID does not exist" Mar 18 17:04:10.215907 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.215852 2575 scope.go:117] "RemoveContainer" containerID="9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd" Mar 18 17:04:10.216188 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:04:10.216167 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd\": container with ID starting with 9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd not 
found: ID does not exist" containerID="9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd" Mar 18 17:04:10.216244 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.216200 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd"} err="failed to get container status \"9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd\": rpc error: code = NotFound desc = could not find container \"9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd\": container with ID starting with 9c622dc7defbe1747eeaff4fed249b10b2e100bb7769cc17c26ac82486c3c1bd not found: ID does not exist" Mar 18 17:04:10.216724 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.216706 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr"] Mar 18 17:04:10.223550 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.223532 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-548584944f-l9hbr"] Mar 18 17:04:10.227163 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.227144 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr"] Mar 18 17:04:10.230611 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:04:10.230574 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42232f00_58d9_41bb_9f87_940aab77d326.slice/crio-d8c23d81015dbed85e354f7beb253c949fd3bc18c8dcdaefe19b318884accd7b WatchSource:0}: Error finding container d8c23d81015dbed85e354f7beb253c949fd3bc18c8dcdaefe19b318884accd7b: Status 404 returned error can't find the container with id d8c23d81015dbed85e354f7beb253c949fd3bc18c8dcdaefe19b318884accd7b Mar 18 17:04:10.240550 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.240533 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ssbw9\" (UniqueName: \"kubernetes.io/projected/93616781-bc34-4623-9464-8c9da06de097-kube-api-access-ssbw9\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:04:10.240617 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.240553 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93616781-bc34-4623-9464-8c9da06de097-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:04:10.291072 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:10.291047 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93616781-bc34-4623-9464-8c9da06de097" path="/var/lib/kubelet/pods/93616781-bc34-4623-9464-8c9da06de097/volumes" Mar 18 17:04:11.200505 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:11.200467 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" event={"ID":"42232f00-58d9-41bb-9f87-940aab77d326","Type":"ContainerStarted","Data":"cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746"} Mar 18 17:04:11.200505 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:11.200509 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" 
event={"ID":"42232f00-58d9-41bb-9f87-940aab77d326","Type":"ContainerStarted","Data":"d8c23d81015dbed85e354f7beb253c949fd3bc18c8dcdaefe19b318884accd7b"} Mar 18 17:04:13.286678 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:04:13.286646 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:04:15.214288 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:15.214254 2575 generic.go:358] "Generic (PLEG): container finished" podID="42232f00-58d9-41bb-9f87-940aab77d326" containerID="cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746" exitCode=0 Mar 18 17:04:15.214678 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:15.214326 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" event={"ID":"42232f00-58d9-41bb-9f87-940aab77d326","Type":"ContainerDied","Data":"cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746"} Mar 18 17:04:16.220189 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:16.220158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" event={"ID":"42232f00-58d9-41bb-9f87-940aab77d326","Type":"ContainerStarted","Data":"94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67"} Mar 18 17:04:16.220637 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:16.220407 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:16.237239 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:16.237191 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" podStartSLOduration=7.237177668 podStartE2EDuration="7.237177668s" podCreationTimestamp="2026-03-18 17:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:04:16.235289293 +0000 UTC m=+1174.546542214" watchObservedRunningTime="2026-03-18 17:04:16.237177668 +0000 UTC m=+1174.548430592" Mar 18 17:04:28.286684 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:04:28.286647 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access 
to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:04:41.287024 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:04:41.286978 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:04:42.243877 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:42.243847 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:04:42.246823 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:42.246803 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:04:47.228891 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:47.228859 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:49.580809 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.580777 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr"] Mar 18 17:04:49.581186 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.581004 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" podUID="42232f00-58d9-41bb-9f87-940aab77d326" containerName="kserve-container" containerID="cri-o://94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67" gracePeriod=30 Mar 18 17:04:49.762505 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.762465 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r"] Mar 18 17:04:49.762961 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.762940 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93616781-bc34-4623-9464-8c9da06de097" containerName="storage-initializer" Mar 18 17:04:49.763008 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.762969 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="93616781-bc34-4623-9464-8c9da06de097" containerName="storage-initializer" Mar 18 17:04:49.763008 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.762984 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93616781-bc34-4623-9464-8c9da06de097" containerName="kserve-container" Mar 18 17:04:49.763008 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.762992 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="93616781-bc34-4623-9464-8c9da06de097" containerName="kserve-container" Mar 18 17:04:49.763118 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.763104 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="93616781-bc34-4623-9464-8c9da06de097" containerName="kserve-container" Mar 18 17:04:49.768240 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.768218 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:04:49.775225 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.775185 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r"] Mar 18 17:04:49.868347 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.868244 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-78475f6cdc-msk7r\" (UID: \"358cc23f-85d1-4a45-bd9f-a896e54b6f7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:04:49.868347 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.868285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhjh9\" (UniqueName: \"kubernetes.io/projected/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kube-api-access-qhjh9\") pod \"isvc-sklearn-mcp-predictor-78475f6cdc-msk7r\" (UID: \"358cc23f-85d1-4a45-bd9f-a896e54b6f7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:04:49.969369 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.969325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-78475f6cdc-msk7r\" (UID: \"358cc23f-85d1-4a45-bd9f-a896e54b6f7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:04:49.969553 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.969391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhjh9\" (UniqueName: \"kubernetes.io/projected/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kube-api-access-qhjh9\") pod \"isvc-sklearn-mcp-predictor-78475f6cdc-msk7r\" (UID: \"358cc23f-85d1-4a45-bd9f-a896e54b6f7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:04:49.969763 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.969736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-78475f6cdc-msk7r\" (UID: \"358cc23f-85d1-4a45-bd9f-a896e54b6f7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:04:49.977572 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:49.977553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhjh9\" (UniqueName: \"kubernetes.io/projected/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kube-api-access-qhjh9\") pod \"isvc-sklearn-mcp-predictor-78475f6cdc-msk7r\" (UID: \"358cc23f-85d1-4a45-bd9f-a896e54b6f7e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:04:50.079564 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.079535 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:04:50.218964 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.218938 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r"] Mar 18 17:04:50.222187 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:04:50.222152 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod358cc23f_85d1_4a45_bd9f_a896e54b6f7e.slice/crio-a4a515aeb11e64233dbccd22fefdd339bb447747da3782a227c6722d5bf4efa6 WatchSource:0}: Error finding container a4a515aeb11e64233dbccd22fefdd339bb447747da3782a227c6722d5bf4efa6: Status 404 returned error can't find the container with id a4a515aeb11e64233dbccd22fefdd339bb447747da3782a227c6722d5bf4efa6 Mar 18 17:04:50.335691 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.335655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" event={"ID":"358cc23f-85d1-4a45-bd9f-a896e54b6f7e","Type":"ContainerStarted","Data":"706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f"} Mar 18 17:04:50.335691 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.335695 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" event={"ID":"358cc23f-85d1-4a45-bd9f-a896e54b6f7e","Type":"ContainerStarted","Data":"a4a515aeb11e64233dbccd22fefdd339bb447747da3782a227c6722d5bf4efa6"} Mar 18 17:04:50.528194 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.528162 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:50.574783 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.574758 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42232f00-58d9-41bb-9f87-940aab77d326-kserve-provision-location\") pod \"42232f00-58d9-41bb-9f87-940aab77d326\" (UID: \"42232f00-58d9-41bb-9f87-940aab77d326\") " Mar 18 17:04:50.574939 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.574819 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzq47\" (UniqueName: \"kubernetes.io/projected/42232f00-58d9-41bb-9f87-940aab77d326-kube-api-access-tzq47\") pod \"42232f00-58d9-41bb-9f87-940aab77d326\" (UID: \"42232f00-58d9-41bb-9f87-940aab77d326\") " Mar 18 17:04:50.575122 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.575099 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42232f00-58d9-41bb-9f87-940aab77d326-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42232f00-58d9-41bb-9f87-940aab77d326" (UID: "42232f00-58d9-41bb-9f87-940aab77d326"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:04:50.576921 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.576899 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42232f00-58d9-41bb-9f87-940aab77d326-kube-api-access-tzq47" (OuterVolumeSpecName: "kube-api-access-tzq47") pod "42232f00-58d9-41bb-9f87-940aab77d326" (UID: "42232f00-58d9-41bb-9f87-940aab77d326"). InnerVolumeSpecName "kube-api-access-tzq47". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:04:50.675691 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.675619 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tzq47\" (UniqueName: \"kubernetes.io/projected/42232f00-58d9-41bb-9f87-940aab77d326-kube-api-access-tzq47\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:04:50.675691 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:50.675646 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42232f00-58d9-41bb-9f87-940aab77d326-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:04:51.343433 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.343321 2575 generic.go:358] "Generic (PLEG): container finished" podID="42232f00-58d9-41bb-9f87-940aab77d326" containerID="94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67" exitCode=0 Mar 18 17:04:51.343433 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.343401 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" event={"ID":"42232f00-58d9-41bb-9f87-940aab77d326","Type":"ContainerDied","Data":"94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67"} Mar 18 17:04:51.343433 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.343419 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" Mar 18 17:04:51.343685 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.343444 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr" event={"ID":"42232f00-58d9-41bb-9f87-940aab77d326","Type":"ContainerDied","Data":"d8c23d81015dbed85e354f7beb253c949fd3bc18c8dcdaefe19b318884accd7b"} Mar 18 17:04:51.343685 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.343461 2575 scope.go:117] "RemoveContainer" containerID="94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67" Mar 18 17:04:51.352231 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.352213 2575 scope.go:117] "RemoveContainer" containerID="cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746" Mar 18 17:04:51.363938 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.363914 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr"] Mar 18 17:04:51.368036 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.368016 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-54fd74b699-jmgxr"] Mar 18 17:04:51.374419 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.374400 2575 scope.go:117] "RemoveContainer" containerID="94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67" Mar 18 17:04:51.374710 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:04:51.374691 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67\": container with ID starting with 94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67 not found: ID does not exist" containerID="94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67" Mar 18 17:04:51.374765 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.374721 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67"} err="failed to get container status \"94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67\": rpc error: code = NotFound desc = could not find container \"94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67\": container with ID starting with 94217e23f429699c97539412e1c0fb0a3e8f2958fe819a81391dfb8f51d5ea67 not found: ID does not exist" Mar 18 17:04:51.374765 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.374738 2575 scope.go:117] "RemoveContainer" containerID="cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746" Mar 18 17:04:51.374976 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:04:51.374961 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746\": container with ID starting with cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746 not found: ID does not exist" containerID="cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746" Mar 18 17:04:51.375012 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:51.374981 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746"} err="failed to get container status \"cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746\": rpc error: code = NotFound desc = could not find container \"cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746\": container with ID starting with cf561f99b0fea8142b20a2af71fe1b03f1e76c2f8fc9b892b06ab8bf1cfee746 not found: ID does not exist" Mar 18 17:04:52.290969 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:52.290938 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42232f00-58d9-41bb-9f87-940aab77d326" path="/var/lib/kubelet/pods/42232f00-58d9-41bb-9f87-940aab77d326/volumes" Mar 18 17:04:54.354918 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:54.354890 2575 generic.go:358] "Generic (PLEG): container finished" podID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerID="706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f" exitCode=0 Mar 18 17:04:54.355226 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:54.354937 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" event={"ID":"358cc23f-85d1-4a45-bd9f-a896e54b6f7e","Type":"ContainerDied","Data":"706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f"} Mar 18 17:04:55.361215 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:55.361143 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" event={"ID":"358cc23f-85d1-4a45-bd9f-a896e54b6f7e","Type":"ContainerStarted","Data":"6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee"} Mar 18 17:04:56.306679 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:04:56.287435 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:04:58.373009 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:58.372976 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" event={"ID":"358cc23f-85d1-4a45-bd9f-a896e54b6f7e","Type":"ContainerStarted","Data":"8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7"} Mar 18 17:04:58.373542 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:58.373204 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:04:58.390878 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:58.390814 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podStartSLOduration=6.331189184 podStartE2EDuration="9.390798715s" podCreationTimestamp="2026-03-18 17:04:49 +0000 UTC" firstStartedPulling="2026-03-18 17:04:54.416261586 +0000 UTC m=+1212.727514486" lastFinishedPulling="2026-03-18 17:04:57.475871104 +0000 UTC m=+1215.787124017" observedRunningTime="2026-03-18 17:04:58.389385332 +0000 UTC m=+1216.700638250" watchObservedRunningTime="2026-03-18 17:04:58.390798715 +0000 UTC m=+1216.702051635" Mar 18 17:04:59.376517 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:04:59.376482 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:05:10.287654 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:05:10.287576 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:05:23.287252 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:05:23.287155 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:05:23.287510 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:05:23.287323 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:05:30.081850 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:05:30.081802 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.41:8080: connect: connection refused" Mar 18 17:05:30.380824 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:05:30.380729 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.41:8080: connect: connection refused" Mar 18 17:05:38.287152 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:05:38.287120 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:05:40.382215 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:05:40.382181 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:05:51.286779 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:05:51.286748 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:06:00.383553 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:00.383517 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" Mar 18 17:06:03.287649 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:06:03.287618 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull 
image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:06:09.702268 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.702234 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r"] Mar 18 17:06:09.702850 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.702640 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-container" containerID="cri-o://6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee" gracePeriod=30 Mar 18 17:06:09.702850 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.702723 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-agent" containerID="cri-o://8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7" gracePeriod=30 Mar 18 17:06:09.945380 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.945332 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"] Mar 18 17:06:09.945707 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.945694 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42232f00-58d9-41bb-9f87-940aab77d326" containerName="storage-initializer" Mar 18 17:06:09.945754 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.945709 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="42232f00-58d9-41bb-9f87-940aab77d326" containerName="storage-initializer" Mar 18 17:06:09.945754 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.945725 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42232f00-58d9-41bb-9f87-940aab77d326" containerName="kserve-container" Mar 18 17:06:09.945754 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.945730 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="42232f00-58d9-41bb-9f87-940aab77d326" containerName="kserve-container" Mar 18 17:06:09.945852 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.945781 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="42232f00-58d9-41bb-9f87-940aab77d326" containerName="kserve-container" Mar 18 17:06:09.948741 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.948726 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" Mar 18 17:06:09.957211 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.957188 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"] Mar 18 17:06:09.991671 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.991646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn8jq\" (UniqueName: \"kubernetes.io/projected/f5183ae5-a793-41f7-b3dd-ad93667451a3-kube-api-access-wn8jq\") pod \"isvc-paddle-predictor-776bd4c848-s9prw\" (UID: \"f5183ae5-a793-41f7-b3dd-ad93667451a3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" Mar 18 17:06:09.991798 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:09.991692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5183ae5-a793-41f7-b3dd-ad93667451a3-kserve-provision-location\") pod \"isvc-paddle-predictor-776bd4c848-s9prw\" (UID: \"f5183ae5-a793-41f7-b3dd-ad93667451a3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" Mar 18 17:06:10.092074 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:10.092050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5183ae5-a793-41f7-b3dd-ad93667451a3-kserve-provision-location\") pod \"isvc-paddle-predictor-776bd4c848-s9prw\" (UID: \"f5183ae5-a793-41f7-b3dd-ad93667451a3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" Mar 18 17:06:10.092183 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:10.092117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn8jq\" (UniqueName: \"kubernetes.io/projected/f5183ae5-a793-41f7-b3dd-ad93667451a3-kube-api-access-wn8jq\") pod \"isvc-paddle-predictor-776bd4c848-s9prw\" (UID: \"f5183ae5-a793-41f7-b3dd-ad93667451a3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" Mar 18 17:06:10.092444 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:10.092426 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5183ae5-a793-41f7-b3dd-ad93667451a3-kserve-provision-location\") pod \"isvc-paddle-predictor-776bd4c848-s9prw\" (UID: \"f5183ae5-a793-41f7-b3dd-ad93667451a3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" Mar 18 17:06:10.100012 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:10.099987 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn8jq\" (UniqueName: \"kubernetes.io/projected/f5183ae5-a793-41f7-b3dd-ad93667451a3-kube-api-access-wn8jq\") pod \"isvc-paddle-predictor-776bd4c848-s9prw\" (UID: \"f5183ae5-a793-41f7-b3dd-ad93667451a3\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" Mar 18 17:06:10.259257 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:10.259179 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"
Mar 18 17:06:10.381055 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:10.381014 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.41:8080: connect: connection refused"
Mar 18 17:06:10.382498 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:10.382433 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"]
Mar 18 17:06:10.383376 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:10.383336 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-agent" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Mar 18 17:06:10.385389 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:06:10.385342 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5183ae5_a793_41f7_b3dd_ad93667451a3.slice/crio-a89f781702af6909f22880dc706aebec96bdf1fc75ab5fff53e84a0468636292 WatchSource:0}: Error finding container a89f781702af6909f22880dc706aebec96bdf1fc75ab5fff53e84a0468636292: Status 404 returned error can't find the container with id a89f781702af6909f22880dc706aebec96bdf1fc75ab5fff53e84a0468636292
Mar 18 17:06:10.597781 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:10.597688 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" event={"ID":"f5183ae5-a793-41f7-b3dd-ad93667451a3","Type":"ContainerStarted","Data":"d4e73081e164a517013c9cdd3cef08df2d67b53db8d6b04ef7fcef821efa8b5a"}
Mar 18 17:06:10.597781 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:10.597726 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" event={"ID":"f5183ae5-a793-41f7-b3dd-ad93667451a3","Type":"ContainerStarted","Data":"a89f781702af6909f22880dc706aebec96bdf1fc75ab5fff53e84a0468636292"}
Mar 18 17:06:12.606536 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:12.606502 2575 generic.go:358] "Generic (PLEG): container finished" podID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerID="6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee" exitCode=0
Mar 18 17:06:12.606908 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:12.606558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" event={"ID":"358cc23f-85d1-4a45-bd9f-a896e54b6f7e","Type":"ContainerDied","Data":"6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee"}
Mar 18 17:06:15.617852 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:15.617813 2575 generic.go:358] "Generic (PLEG): container finished" podID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerID="d4e73081e164a517013c9cdd3cef08df2d67b53db8d6b04ef7fcef821efa8b5a" exitCode=0
Mar 18 17:06:15.618203 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:15.617885 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" event={"ID":"f5183ae5-a793-41f7-b3dd-ad93667451a3","Type":"ContainerDied","Data":"d4e73081e164a517013c9cdd3cef08df2d67b53db8d6b04ef7fcef821efa8b5a"}
Mar 18 17:06:17.287065 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:06:17.287026 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:06:20.380113 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:20.380069 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.41:8080: connect: connection refused"
Mar 18 17:06:20.383433 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:20.383408 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-agent" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Mar 18 17:06:27.674184 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:27.674145 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" event={"ID":"f5183ae5-a793-41f7-b3dd-ad93667451a3","Type":"ContainerStarted","Data":"1e17259371e8d11b044c54deb4a5e898e558426173103daa384ceede092be666"}
Mar 18 17:06:27.674768 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:27.674737 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"
Mar 18 17:06:27.675914 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:27.675886 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Mar 18 17:06:27.692142 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:27.692103 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" podStartSLOduration=7.396139109 podStartE2EDuration="18.692091525s" podCreationTimestamp="2026-03-18 17:06:09 +0000 UTC" firstStartedPulling="2026-03-18 17:06:15.618976253 +0000 UTC m=+1293.930229152" lastFinishedPulling="2026-03-18 17:06:26.914928666 +0000 UTC m=+1305.226181568" observedRunningTime="2026-03-18 17:06:27.691219517 +0000 UTC m=+1306.002472439" watchObservedRunningTime="2026-03-18 17:06:27.692091525 +0000 UTC m=+1306.003344436"
Mar 18 17:06:28.678253 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:28.678207 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Mar 18 17:06:30.381021 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:30.380980 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.41:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.41:8080: connect: connection refused"
Mar 18 17:06:30.381414 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:30.381111 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r"
Mar 18 17:06:30.383296 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:30.383272 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-agent" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused"
Mar 18 17:06:30.383399 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:30.383384 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r"
Mar 18 17:06:32.290338 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:06:32.290307 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:06:38.678296 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:38.678249 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Mar 18 17:06:39.914986 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:39.914961 2575 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r"
Mar 18 17:06:40.051585 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.051509 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhjh9\" (UniqueName: \"kubernetes.io/projected/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kube-api-access-qhjh9\") pod \"358cc23f-85d1-4a45-bd9f-a896e54b6f7e\" (UID: \"358cc23f-85d1-4a45-bd9f-a896e54b6f7e\") "
Mar 18 17:06:40.051733 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.051586 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kserve-provision-location\") pod \"358cc23f-85d1-4a45-bd9f-a896e54b6f7e\" (UID: \"358cc23f-85d1-4a45-bd9f-a896e54b6f7e\") "
Mar 18 17:06:40.051923 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.051887 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "358cc23f-85d1-4a45-bd9f-a896e54b6f7e" (UID: "358cc23f-85d1-4a45-bd9f-a896e54b6f7e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 17:06:40.053797 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.053769 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kube-api-access-qhjh9" (OuterVolumeSpecName: "kube-api-access-qhjh9") pod "358cc23f-85d1-4a45-bd9f-a896e54b6f7e" (UID: "358cc23f-85d1-4a45-bd9f-a896e54b6f7e"). InnerVolumeSpecName "kube-api-access-qhjh9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:06:40.152941 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.152910 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 17:06:40.152941 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.152937 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qhjh9\" (UniqueName: \"kubernetes.io/projected/358cc23f-85d1-4a45-bd9f-a896e54b6f7e-kube-api-access-qhjh9\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 17:06:40.719643 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.719611 2575 generic.go:358] "Generic (PLEG): container finished" podID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerID="8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7" exitCode=137
Mar 18 17:06:40.719811 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.719704 2575 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r"
Mar 18 17:06:40.719811 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.719703 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" event={"ID":"358cc23f-85d1-4a45-bd9f-a896e54b6f7e","Type":"ContainerDied","Data":"8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7"}
Mar 18 17:06:40.719884 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.719815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r" event={"ID":"358cc23f-85d1-4a45-bd9f-a896e54b6f7e","Type":"ContainerDied","Data":"a4a515aeb11e64233dbccd22fefdd339bb447747da3782a227c6722d5bf4efa6"}
Mar 18 17:06:40.719884 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.719830 2575 scope.go:117] "RemoveContainer" containerID="8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7"
Mar 18 17:06:40.730170 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.730149 2575 scope.go:117] "RemoveContainer" containerID="6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee"
Mar 18 17:06:40.737871 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.737849 2575 scope.go:117] "RemoveContainer" containerID="706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f"
Mar 18 17:06:40.745792 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.745768 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r"]
Mar 18 17:06:40.747038 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.747020 2575 scope.go:117] "RemoveContainer" containerID="8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7"
Mar 18 17:06:40.747302 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:06:40.747285 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7\": container with ID starting with 8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7 not found: ID does not exist" containerID="8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7"
Mar 18 17:06:40.747392 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.747309 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7"} err="failed to get container status \"8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7\": rpc error: code = NotFound desc = could not find container \"8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7\": container with ID starting with 8f6ae194699008e985e7c9ce9173953a81dc0d45ff85141063e823ca1a6204b7 not found: ID does not exist"
Mar 18 17:06:40.747392 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.747327 2575 scope.go:117] "RemoveContainer" containerID="6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee"
Mar 18 17:06:40.747588 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:06:40.747569 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee\": container with ID starting with 6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee not found: ID does not exist" containerID="6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee"
Mar 18 17:06:40.747640 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.747599 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee"} err="failed to get container status \"6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee\": rpc error: code = NotFound desc = could not find container \"6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee\": container with ID starting with 6b5629a2128d125865a08f5f020e4ed8bcbfaf86df6b5cc5e21c6cc5a4faecee not found: ID does not exist"
Mar 18 17:06:40.747640 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.747623 2575 scope.go:117] "RemoveContainer" containerID="706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f"
Mar 18 17:06:40.747971 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:06:40.747830 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f\": container with ID starting with 706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f not found: ID does not exist" containerID="706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f"
Mar 18 17:06:40.747971 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.747846 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f"} err="failed to get container status \"706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f\": rpc error: code = NotFound desc = could not find container \"706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f\": container with ID starting with 706f0e7ff35872a61a0ee50b14e88ee01467c18834f609c85a74d6541ba3d80f not found: ID does not exist"
Mar 18 17:06:40.756813 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:40.756792 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-78475f6cdc-msk7r"]
Mar 18 17:06:42.291517 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:42.291479 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" path="/var/lib/kubelet/pods/358cc23f-85d1-4a45-bd9f-a896e54b6f7e/volumes"
Mar 18 17:06:45.287614 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:06:45.287581 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:06:48.678502 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:48.678458 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Mar 18 17:06:57.287505 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:06:57.287421 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:06:58.679053 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:06:58.679011 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused"
Mar 18 17:07:08.679684 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:08.679655 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"
Mar 18 17:07:10.287166 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:07:10.287137 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:07:11.213456 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.213415 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"]
Mar 18 17:07:11.213753 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.213725 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="kserve-container" containerID="cri-o://1e17259371e8d11b044c54deb4a5e898e558426173103daa384ceede092be666" gracePeriod=30
Mar 18 17:07:11.353898 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.353862 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"]
Mar 18 17:07:11.354318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.354190 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-container"
Mar 18 17:07:11.354318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.354202 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-container"
Mar 18 17:07:11.354318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.354218 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="storage-initializer"
Mar 18 17:07:11.354318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.354224 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="storage-initializer"
Mar 18 17:07:11.354318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.354236 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-agent"
Mar 18 17:07:11.354318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.354242 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-agent"
Mar 18 17:07:11.354318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.354291 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-agent"
Mar 18 17:07:11.354318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.354300 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="358cc23f-85d1-4a45-bd9f-a896e54b6f7e" containerName="kserve-container"
Mar 18 17:07:11.368440 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.368408 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"]
Mar 18 17:07:11.368617 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.368528 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:07:11.402550 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.402511 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6mlb\" (UniqueName: \"kubernetes.io/projected/e08444d3-95a0-445d-bd2b-dede7d68aef3-kube-api-access-g6mlb\") pod \"isvc-paddle-runtime-predictor-85747bd85d-mgm6z\" (UID: \"e08444d3-95a0-445d-bd2b-dede7d68aef3\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:07:11.402707 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.402641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e08444d3-95a0-445d-bd2b-dede7d68aef3-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-85747bd85d-mgm6z\" (UID: \"e08444d3-95a0-445d-bd2b-dede7d68aef3\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:07:11.504064 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.503961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e08444d3-95a0-445d-bd2b-dede7d68aef3-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-85747bd85d-mgm6z\" (UID: \"e08444d3-95a0-445d-bd2b-dede7d68aef3\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:07:11.504064 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.504026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6mlb\" (UniqueName: \"kubernetes.io/projected/e08444d3-95a0-445d-bd2b-dede7d68aef3-kube-api-access-g6mlb\") pod \"isvc-paddle-runtime-predictor-85747bd85d-mgm6z\" (UID: \"e08444d3-95a0-445d-bd2b-dede7d68aef3\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:07:11.504315 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.504295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e08444d3-95a0-445d-bd2b-dede7d68aef3-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-85747bd85d-mgm6z\" (UID: \"e08444d3-95a0-445d-bd2b-dede7d68aef3\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:07:11.511272 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.511239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6mlb\" (UniqueName: \"kubernetes.io/projected/e08444d3-95a0-445d-bd2b-dede7d68aef3-kube-api-access-g6mlb\") pod \"isvc-paddle-runtime-predictor-85747bd85d-mgm6z\" (UID: \"e08444d3-95a0-445d-bd2b-dede7d68aef3\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:07:11.679701 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.679670 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:07:11.806812 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.806786 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"]
Mar 18 17:07:11.809640 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:07:11.809610 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode08444d3_95a0_445d_bd2b_dede7d68aef3.slice/crio-a12dfa90ce64e0f380e5e20a17011c2228de68e6a7bb61fe755c8e688cc7adaa WatchSource:0}: Error finding container a12dfa90ce64e0f380e5e20a17011c2228de68e6a7bb61fe755c8e688cc7adaa: Status 404 returned error can't find the container with id a12dfa90ce64e0f380e5e20a17011c2228de68e6a7bb61fe755c8e688cc7adaa
Mar 18 17:07:11.823127 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:11.823101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" event={"ID":"e08444d3-95a0-445d-bd2b-dede7d68aef3","Type":"ContainerStarted","Data":"a12dfa90ce64e0f380e5e20a17011c2228de68e6a7bb61fe755c8e688cc7adaa"}
Mar 18 17:07:12.827638 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:12.827601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" event={"ID":"e08444d3-95a0-445d-bd2b-dede7d68aef3","Type":"ContainerStarted","Data":"f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb"}
Mar 18 17:07:13.832573 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:13.832543 2575 generic.go:358] "Generic (PLEG): container finished" podID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerID="1e17259371e8d11b044c54deb4a5e898e558426173103daa384ceede092be666" exitCode=0
Mar 18 17:07:13.832928 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:13.832618 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" event={"ID":"f5183ae5-a793-41f7-b3dd-ad93667451a3","Type":"ContainerDied","Data":"1e17259371e8d11b044c54deb4a5e898e558426173103daa384ceede092be666"}
Mar 18 17:07:13.866462 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:13.866441 2575 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"
Mar 18 17:07:13.924549 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:13.924523 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn8jq\" (UniqueName: \"kubernetes.io/projected/f5183ae5-a793-41f7-b3dd-ad93667451a3-kube-api-access-wn8jq\") pod \"f5183ae5-a793-41f7-b3dd-ad93667451a3\" (UID: \"f5183ae5-a793-41f7-b3dd-ad93667451a3\") "
Mar 18 17:07:13.924697 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:13.924591 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5183ae5-a793-41f7-b3dd-ad93667451a3-kserve-provision-location\") pod \"f5183ae5-a793-41f7-b3dd-ad93667451a3\" (UID: \"f5183ae5-a793-41f7-b3dd-ad93667451a3\") "
Mar 18 17:07:13.926729 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:13.926700 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5183ae5-a793-41f7-b3dd-ad93667451a3-kube-api-access-wn8jq" (OuterVolumeSpecName: "kube-api-access-wn8jq") pod "f5183ae5-a793-41f7-b3dd-ad93667451a3" (UID: "f5183ae5-a793-41f7-b3dd-ad93667451a3"). InnerVolumeSpecName "kube-api-access-wn8jq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:07:13.934168 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:13.934117 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5183ae5-a793-41f7-b3dd-ad93667451a3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f5183ae5-a793-41f7-b3dd-ad93667451a3" (UID: "f5183ae5-a793-41f7-b3dd-ad93667451a3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 17:07:14.026105 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:14.026076 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wn8jq\" (UniqueName: \"kubernetes.io/projected/f5183ae5-a793-41f7-b3dd-ad93667451a3-kube-api-access-wn8jq\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 17:07:14.026105 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:14.026101 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5183ae5-a793-41f7-b3dd-ad93667451a3-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 17:07:14.837020 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:14.836978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw" event={"ID":"f5183ae5-a793-41f7-b3dd-ad93667451a3","Type":"ContainerDied","Data":"a89f781702af6909f22880dc706aebec96bdf1fc75ab5fff53e84a0468636292"}
Mar 18 17:07:14.837435 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:14.837028 2575 scope.go:117] "RemoveContainer" containerID="1e17259371e8d11b044c54deb4a5e898e558426173103daa384ceede092be666"
Mar 18 17:07:14.837435 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:14.837035 2575 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"
Mar 18 17:07:14.845634 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:14.845612 2575 scope.go:117] "RemoveContainer" containerID="d4e73081e164a517013c9cdd3cef08df2d67b53db8d6b04ef7fcef821efa8b5a"
Mar 18 17:07:14.852896 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:14.852871 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"]
Mar 18 17:07:14.857529 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:14.857500 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-776bd4c848-s9prw"]
Mar 18 17:07:16.290113 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:16.290083 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" path="/var/lib/kubelet/pods/f5183ae5-a793-41f7-b3dd-ad93667451a3/volumes"
Mar 18 17:07:16.846412 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:16.846354 2575 generic.go:358] "Generic (PLEG): container finished" podID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerID="f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb" exitCode=0
Mar 18 17:07:16.846553 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:16.846416 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" event={"ID":"e08444d3-95a0-445d-bd2b-dede7d68aef3","Type":"ContainerDied","Data":"f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb"}
Mar 18 17:07:17.854469 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:17.854438 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" event={"ID":"e08444d3-95a0-445d-bd2b-dede7d68aef3","Type":"ContainerStarted","Data":"6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32"}
Mar 18 17:07:17.854928 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:17.854720 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:07:17.856136 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:17.856109 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Mar 18 17:07:17.872380 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:17.872324 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" podStartSLOduration=6.8723127680000005 podStartE2EDuration="6.872312768s" podCreationTimestamp="2026-03-18 17:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:07:17.870427687 +0000 UTC m=+1356.181680621" watchObservedRunningTime="2026-03-18 17:07:17.872312768 +0000 UTC m=+1356.183565688"
Mar 18 17:07:18.863164 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:18.863122 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Mar 18 17:07:24.286938 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:07:24.286899 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:07:28.863523 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:28.863484 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Mar 18 17:07:38.614855 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:07:38.614760 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast"
Mar 18 17:07:38.615265 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:07:38.614927 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 17:07:38.616097 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:07:38.616069 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:07:38.863292 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:38.863250 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Mar 18 17:07:48.863722 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:48.863679 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Mar 18 17:07:49.286645 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:07:49.286613 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:07:58.864519 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:07:58.864492 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:08:02.789593 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:02.789563 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"]
Mar 18 17:08:02.789986 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:02.789809 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="kserve-container" containerID="cri-o://6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32" gracePeriod=30
Mar 18 17:08:02.952060 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:02.952024 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"]
Mar 18 17:08:02.952411 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:02.952392 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="kserve-container"
Mar 18 17:08:02.952411 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:02.952413 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="kserve-container"
Mar 18 17:08:02.952537 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:02.952431 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="storage-initializer"
Mar 18 17:08:02.952537 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:02.952436 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="storage-initializer"
Mar 18 17:08:02.952537 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:02.952513 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5183ae5-a793-41f7-b3dd-ad93667451a3" containerName="kserve-container"
Mar 18 17:08:02.955681 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:02.955664 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"
Mar 18 17:08:02.963022 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:02.962999 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"]
Mar 18 17:08:03.125550 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:03.125466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da54fccd-dac7-4642-adca-5ef9afac042b-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52\" (UID: \"da54fccd-dac7-4642-adca-5ef9afac042b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"
Mar 18 17:08:03.125550 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:03.125512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdlvj\" (UniqueName: \"kubernetes.io/projected/da54fccd-dac7-4642-adca-5ef9afac042b-kube-api-access-jdlvj\") pod \"isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52\" (UID: \"da54fccd-dac7-4642-adca-5ef9afac042b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"
Mar 18 17:08:03.226298 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:03.226260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da54fccd-dac7-4642-adca-5ef9afac042b-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52\" (UID: \"da54fccd-dac7-4642-adca-5ef9afac042b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"
Mar 18 17:08:03.226490 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:03.226311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdlvj\" (UniqueName: \"kubernetes.io/projected/da54fccd-dac7-4642-adca-5ef9afac042b-kube-api-access-jdlvj\") pod \"isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52\" (UID: \"da54fccd-dac7-4642-adca-5ef9afac042b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"
Mar 18 17:08:03.226651 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:03.226630 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da54fccd-dac7-4642-adca-5ef9afac042b-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52\" (UID: \"da54fccd-dac7-4642-adca-5ef9afac042b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"
Mar 18 17:08:03.234737 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:03.234714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdlvj\" (UniqueName: \"kubernetes.io/projected/da54fccd-dac7-4642-adca-5ef9afac042b-kube-api-access-jdlvj\") pod \"isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52\" (UID: \"da54fccd-dac7-4642-adca-5ef9afac042b\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"
Mar 18 17:08:03.266549 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:03.266524 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"
Mar 18 17:08:03.393903 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:03.393815 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"]
Mar 18 17:08:03.396260 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:08:03.396232 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda54fccd_dac7_4642_adca_5ef9afac042b.slice/crio-b7168f11e87945447eb19faeced010346714489b51c876169441514db52c2750 WatchSource:0}: Error finding container b7168f11e87945447eb19faeced010346714489b51c876169441514db52c2750: Status 404 returned error can't find the container with id b7168f11e87945447eb19faeced010346714489b51c876169441514db52c2750
Mar 18 17:08:04.015651 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:04.015608 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" event={"ID":"da54fccd-dac7-4642-adca-5ef9afac042b","Type":"ContainerStarted","Data":"4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641"}
Mar 18 17:08:04.015651 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:04.015657 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" event={"ID":"da54fccd-dac7-4642-adca-5ef9afac042b","Type":"ContainerStarted","Data":"b7168f11e87945447eb19faeced010346714489b51c876169441514db52c2750"}
Mar 18 17:08:04.287485 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:08:04.287455 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:08:05.330481 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:05.330458 2575 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"
Mar 18 17:08:05.442348 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:05.442265 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6mlb\" (UniqueName: \"kubernetes.io/projected/e08444d3-95a0-445d-bd2b-dede7d68aef3-kube-api-access-g6mlb\") pod \"e08444d3-95a0-445d-bd2b-dede7d68aef3\" (UID: \"e08444d3-95a0-445d-bd2b-dede7d68aef3\") "
Mar 18 17:08:05.442348 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:05.442309 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e08444d3-95a0-445d-bd2b-dede7d68aef3-kserve-provision-location\") pod \"e08444d3-95a0-445d-bd2b-dede7d68aef3\" (UID: \"e08444d3-95a0-445d-bd2b-dede7d68aef3\") "
Mar 18 17:08:05.444506 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:05.444482 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08444d3-95a0-445d-bd2b-dede7d68aef3-kube-api-access-g6mlb" (OuterVolumeSpecName: "kube-api-access-g6mlb") pod "e08444d3-95a0-445d-bd2b-dede7d68aef3" (UID: "e08444d3-95a0-445d-bd2b-dede7d68aef3"). InnerVolumeSpecName "kube-api-access-g6mlb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:08:05.452153 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:05.452127 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e08444d3-95a0-445d-bd2b-dede7d68aef3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e08444d3-95a0-445d-bd2b-dede7d68aef3" (UID: "e08444d3-95a0-445d-bd2b-dede7d68aef3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 17:08:05.542964 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:05.542924 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g6mlb\" (UniqueName: \"kubernetes.io/projected/e08444d3-95a0-445d-bd2b-dede7d68aef3-kube-api-access-g6mlb\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 17:08:05.542964 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:05.542958 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e08444d3-95a0-445d-bd2b-dede7d68aef3-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 17:08:06.022986 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.022952 2575 generic.go:358] "Generic (PLEG): container finished" podID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerID="6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32" exitCode=0
Mar 18 17:08:06.023147 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.023013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" event={"ID":"e08444d3-95a0-445d-bd2b-dede7d68aef3","Type":"ContainerDied","Data":"6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32"}
Mar 18 17:08:06.023147 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.023017 2575 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" Mar 18 17:08:06.023147 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.023039 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z" event={"ID":"e08444d3-95a0-445d-bd2b-dede7d68aef3","Type":"ContainerDied","Data":"a12dfa90ce64e0f380e5e20a17011c2228de68e6a7bb61fe755c8e688cc7adaa"} Mar 18 17:08:06.023147 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.023053 2575 scope.go:117] "RemoveContainer" containerID="6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32" Mar 18 17:08:06.032696 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.032677 2575 scope.go:117] "RemoveContainer" containerID="f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb" Mar 18 17:08:06.041750 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.041608 2575 scope.go:117] "RemoveContainer" containerID="6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32" Mar 18 17:08:06.041959 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:08:06.041914 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32\": container with ID starting with 6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32 not found: ID does not exist" containerID="6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32" Mar 18 17:08:06.041959 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.041948 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32"} err="failed to get container status \"6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32\": rpc error: code = NotFound desc = could not find container \"6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32\": container with ID starting with 6cd5b23e0e40892201f335f9563226c2f564f03460d204810f784e3a5aacab32 not found: ID does not exist" Mar 18 17:08:06.042132 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.041972 2575 scope.go:117] "RemoveContainer" containerID="f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb" Mar 18 17:08:06.042388 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:08:06.042248 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb\": container with ID starting with f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb not found: ID does not exist" containerID="f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb" Mar 18 17:08:06.042388 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.042275 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb"} err="failed to get container status \"f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb\": rpc error: code = NotFound desc = could not find container \"f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb\": container with ID starting with f5274ce5787ed573e6040453a7f72b95c9f8b03a34558b450f4efbe90adb3aeb not found: ID does not exist" Mar 18 17:08:06.043271 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.043254 2575 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"] Mar 18 17:08:06.047257 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.047236 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-85747bd85d-mgm6z"] Mar 18 17:08:06.291324 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:06.291249 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" path="/var/lib/kubelet/pods/e08444d3-95a0-445d-bd2b-dede7d68aef3/volumes" Mar 18 17:08:08.033318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:08.033288 2575 generic.go:358] "Generic (PLEG): container finished" podID="da54fccd-dac7-4642-adca-5ef9afac042b" containerID="4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641" exitCode=0 Mar 18 17:08:08.033680 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:08.033348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" event={"ID":"da54fccd-dac7-4642-adca-5ef9afac042b","Type":"ContainerDied","Data":"4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641"} Mar 18 17:08:09.037735 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:09.037701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" event={"ID":"da54fccd-dac7-4642-adca-5ef9afac042b","Type":"ContainerStarted","Data":"469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de"} Mar 18 17:08:09.038115 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:09.037996 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" Mar 18 17:08:09.039315 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:09.039288 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Mar 18 17:08:09.055392 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:09.055323 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" podStartSLOduration=7.055309848 podStartE2EDuration="7.055309848s" podCreationTimestamp="2026-03-18 17:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:08:09.054201823 +0000 UTC m=+1407.365454744" watchObservedRunningTime="2026-03-18 17:08:09.055309848 +0000 UTC m=+1407.366562770" Mar 18 17:08:10.041424 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:10.041384 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Mar 18 17:08:19.287453 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:08:19.287218 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source 
Mar 18 17:08:20.042015 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:20.041919 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Mar 18 17:08:30.041584 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:30.041541 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Mar 18 17:08:32.290193 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:08:32.290160 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:08:40.042233 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:40.042192 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused"
Mar 18 17:08:46.287453 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:08:46.287414 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:08:50.042545 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:50.042518 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"
Mar 18 17:08:54.502939 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.502905 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"]
Mar 18 17:08:54.503420 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.503259 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="kserve-container" containerID="cri-o://469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de" gracePeriod=30
Mar 18 17:08:54.662938 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.662900 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4"]
Mar 18 17:08:54.663253 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.663241 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="storage-initializer"
Mar 18 17:08:54.663295 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.663255 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="storage-initializer"
Mar 18 17:08:54.663295 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.663267 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="kserve-container"
Mar 18 17:08:54.663295 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.663273 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="kserve-container"
Mar 18 17:08:54.663477 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.663335 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e08444d3-95a0-445d-bd2b-dede7d68aef3" containerName="kserve-container"
Mar 18 17:08:54.667829 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.667810 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" Mar 18 17:08:54.674046 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.674023 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4"] Mar 18 17:08:54.725575 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.725544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a42992c6-f3b5-44aa-a2e9-6281900a9307-kserve-provision-location\") pod \"isvc-pmml-predictor-bbb95f64c-nf9m4\" (UID: \"a42992c6-f3b5-44aa-a2e9-6281900a9307\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" Mar 18 17:08:54.725575 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.725586 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncpvs\" (UniqueName: \"kubernetes.io/projected/a42992c6-f3b5-44aa-a2e9-6281900a9307-kube-api-access-ncpvs\") pod \"isvc-pmml-predictor-bbb95f64c-nf9m4\" (UID: \"a42992c6-f3b5-44aa-a2e9-6281900a9307\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" Mar 18 17:08:54.826304 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.826224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a42992c6-f3b5-44aa-a2e9-6281900a9307-kserve-provision-location\") pod \"isvc-pmml-predictor-bbb95f64c-nf9m4\" (UID: \"a42992c6-f3b5-44aa-a2e9-6281900a9307\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" Mar 18 17:08:54.826304 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.826260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncpvs\" (UniqueName: \"kubernetes.io/projected/a42992c6-f3b5-44aa-a2e9-6281900a9307-kube-api-access-ncpvs\") pod \"isvc-pmml-predictor-bbb95f64c-nf9m4\" (UID: \"a42992c6-f3b5-44aa-a2e9-6281900a9307\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" Mar 18 17:08:54.826622 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.826605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a42992c6-f3b5-44aa-a2e9-6281900a9307-kserve-provision-location\") pod \"isvc-pmml-predictor-bbb95f64c-nf9m4\" (UID: \"a42992c6-f3b5-44aa-a2e9-6281900a9307\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" Mar 18 17:08:54.834191 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.834167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncpvs\" (UniqueName: \"kubernetes.io/projected/a42992c6-f3b5-44aa-a2e9-6281900a9307-kube-api-access-ncpvs\") pod \"isvc-pmml-predictor-bbb95f64c-nf9m4\" (UID: \"a42992c6-f3b5-44aa-a2e9-6281900a9307\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" Mar 18 17:08:54.978808 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:54.978779 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" Mar 18 17:08:55.098507 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:55.098419 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4"] Mar 18 17:08:55.101190 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:08:55.101155 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42992c6_f3b5_44aa_a2e9_6281900a9307.slice/crio-ca7d5826d58c255c36d6b42653a172434f3cef19b51b5108719d91a8c9998887 WatchSource:0}: Error finding container ca7d5826d58c255c36d6b42653a172434f3cef19b51b5108719d91a8c9998887: Status 404 returned error can't find the container with id ca7d5826d58c255c36d6b42653a172434f3cef19b51b5108719d91a8c9998887 Mar 18 17:08:55.190600 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:55.190565 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" event={"ID":"a42992c6-f3b5-44aa-a2e9-6281900a9307","Type":"ContainerStarted","Data":"9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5"} Mar 18 17:08:55.190600 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:55.190602 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" event={"ID":"a42992c6-f3b5-44aa-a2e9-6281900a9307","Type":"ContainerStarted","Data":"ca7d5826d58c255c36d6b42653a172434f3cef19b51b5108719d91a8c9998887"} Mar 18 17:08:57.052932 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.052908 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" Mar 18 17:08:57.148197 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.148124 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da54fccd-dac7-4642-adca-5ef9afac042b-kserve-provision-location\") pod \"da54fccd-dac7-4642-adca-5ef9afac042b\" (UID: \"da54fccd-dac7-4642-adca-5ef9afac042b\") " Mar 18 17:08:57.148327 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.148203 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdlvj\" (UniqueName: \"kubernetes.io/projected/da54fccd-dac7-4642-adca-5ef9afac042b-kube-api-access-jdlvj\") pod \"da54fccd-dac7-4642-adca-5ef9afac042b\" (UID: \"da54fccd-dac7-4642-adca-5ef9afac042b\") " Mar 18 17:08:57.150507 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.150481 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da54fccd-dac7-4642-adca-5ef9afac042b-kube-api-access-jdlvj" (OuterVolumeSpecName: "kube-api-access-jdlvj") pod "da54fccd-dac7-4642-adca-5ef9afac042b" (UID: "da54fccd-dac7-4642-adca-5ef9afac042b"). InnerVolumeSpecName "kube-api-access-jdlvj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:08:57.158061 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.158038 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da54fccd-dac7-4642-adca-5ef9afac042b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da54fccd-dac7-4642-adca-5ef9afac042b" (UID: "da54fccd-dac7-4642-adca-5ef9afac042b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:08:57.199352 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.199327 2575 generic.go:358] "Generic (PLEG): container finished" podID="da54fccd-dac7-4642-adca-5ef9afac042b" containerID="469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de" exitCode=0 Mar 18 17:08:57.199467 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.199414 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" Mar 18 17:08:57.199513 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.199408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" event={"ID":"da54fccd-dac7-4642-adca-5ef9afac042b","Type":"ContainerDied","Data":"469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de"} Mar 18 17:08:57.199552 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.199528 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52" event={"ID":"da54fccd-dac7-4642-adca-5ef9afac042b","Type":"ContainerDied","Data":"b7168f11e87945447eb19faeced010346714489b51c876169441514db52c2750"} Mar 18 17:08:57.199552 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.199549 2575 scope.go:117] "RemoveContainer" containerID="469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de" Mar 18 17:08:57.207880 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.207866 2575 scope.go:117] "RemoveContainer" containerID="4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641" Mar 18 17:08:57.216532 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.216516 2575 scope.go:117] "RemoveContainer" containerID="469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de" Mar 18 17:08:57.216760 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:08:57.216741 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de\": container with ID starting with 469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de not found: ID does not exist" containerID="469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de" Mar 18 17:08:57.216814 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.216768 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de"} err="failed to get container status \"469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de\": rpc error: code = NotFound desc = could not find container \"469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de\": container with ID starting with 469b2916930c64230f7ce1a588f120593eef117977dcd83b9772299461cee9de not found: ID does not exist" Mar 18 17:08:57.216814 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.216784 2575 scope.go:117] "RemoveContainer" containerID="4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641" Mar 18 17:08:57.217041 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:08:57.217001 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641\": container with ID starting with 4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641 not found: ID 
does not exist" containerID="4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641" Mar 18 17:08:57.217041 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.217026 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641"} err="failed to get container status \"4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641\": rpc error: code = NotFound desc = could not find container \"4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641\": container with ID starting with 4e85e245b6b497bec6878b1f1fdcb8c2902c1196985a01fd984cd1531181f641 not found: ID does not exist" Mar 18 17:08:57.219590 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.219568 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"] Mar 18 17:08:57.221014 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.220995 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-59dbd8f4bd-wtm52"] Mar 18 17:08:57.248891 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.248869 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jdlvj\" (UniqueName: \"kubernetes.io/projected/da54fccd-dac7-4642-adca-5ef9afac042b-kube-api-access-jdlvj\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:08:57.248891 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:57.248891 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da54fccd-dac7-4642-adca-5ef9afac042b-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:08:58.289566 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:58.289533 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" path="/var/lib/kubelet/pods/da54fccd-dac7-4642-adca-5ef9afac042b/volumes" Mar 18 17:08:59.209005 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:59.208970 2575 generic.go:358] "Generic (PLEG): container finished" podID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerID="9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5" exitCode=0 Mar 18 17:08:59.209166 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:08:59.209015 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" event={"ID":"a42992c6-f3b5-44aa-a2e9-6281900a9307","Type":"ContainerDied","Data":"9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5"} Mar 18 17:08:59.287296 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:08:59.287268 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:09:06.242301 
Mar 18 17:09:06.242726 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:06.242583 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4"
Mar 18 17:09:06.243964 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:06.243936 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Mar 18 17:09:06.259877 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:06.259838 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podStartSLOduration=5.354289629 podStartE2EDuration="12.259825809s" podCreationTimestamp="2026-03-18 17:08:54 +0000 UTC" firstStartedPulling="2026-03-18 17:08:59.210194944 +0000 UTC m=+1457.521447842" lastFinishedPulling="2026-03-18 17:09:06.115731124 +0000 UTC m=+1464.426984022" observedRunningTime="2026-03-18 17:09:06.257883069 +0000 UTC m=+1464.569135989" watchObservedRunningTime="2026-03-18 17:09:06.259825809 +0000 UTC m=+1464.571078729"
Mar 18 17:09:07.246025 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:07.245987 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Mar 18 17:09:10.286561 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:09:10.286520 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:09:17.246267 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:17.246226 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Mar 18 17:09:21.287385 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:09:21.287341 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:09:27.246825 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:27.246782 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Mar 18 17:09:36.287700 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:09:36.287668 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:09:37.246786 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:37.246741 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Mar 18 17:09:42.265602 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:42.265579 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log"
Mar 18 17:09:42.271496 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:42.271476 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log"
Mar 18 17:09:47.246407 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:47.246343 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Mar 18 17:09:50.287278 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:09:50.287248 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:09:57.246209 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:09:57.246160 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Mar 18 17:10:05.287019 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:10:05.286986 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:10:07.246648 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:07.246601 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Mar 18 17:10:17.246276 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:17.246236 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused"
Mar 18 17:10:18.286846 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:10:18.286811 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:10:27.247545 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:27.247512 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4"
Mar 18 17:10:32.288471 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:32.288428 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 17:10:32.288717 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:10:32.288640 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:10:35.775870 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:35.775834 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4"]
Mar 18 17:10:35.776314 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:35.776074 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" containerID="cri-o://d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06" gracePeriod=30
Mar 18 17:10:36.047161 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.047090 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn"]
Mar 18 17:10:36.047438 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.047426 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="storage-initializer"
Mar 18 17:10:36.047484 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.047440 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="storage-initializer"
Mar 18 17:10:36.047484 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.047457 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="kserve-container"
Mar 18 17:10:36.047484 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.047463 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="kserve-container"
Mar 18 17:10:36.047593 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.047517 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="da54fccd-dac7-4642-adca-5ef9afac042b" containerName="kserve-container"
Mar 18 17:10:36.050433 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.050417 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:10:36.057613 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.057592 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn"] Mar 18 17:10:36.140187 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.140159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pww44\" (UniqueName: \"kubernetes.io/projected/7a3f566a-0aad-4697-9707-43c4760df7c8-kube-api-access-pww44\") pod \"isvc-pmml-runtime-predictor-7d448dc787-rxvbn\" (UID: \"7a3f566a-0aad-4697-9707-43c4760df7c8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:10:36.140187 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.140186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a3f566a-0aad-4697-9707-43c4760df7c8-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7d448dc787-rxvbn\" (UID: \"7a3f566a-0aad-4697-9707-43c4760df7c8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:10:36.240977 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.240943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pww44\" (UniqueName: \"kubernetes.io/projected/7a3f566a-0aad-4697-9707-43c4760df7c8-kube-api-access-pww44\") pod \"isvc-pmml-runtime-predictor-7d448dc787-rxvbn\" (UID: \"7a3f566a-0aad-4697-9707-43c4760df7c8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:10:36.241131 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.240979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a3f566a-0aad-4697-9707-43c4760df7c8-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7d448dc787-rxvbn\" (UID: \"7a3f566a-0aad-4697-9707-43c4760df7c8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:10:36.241387 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.241334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a3f566a-0aad-4697-9707-43c4760df7c8-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7d448dc787-rxvbn\" (UID: \"7a3f566a-0aad-4697-9707-43c4760df7c8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:10:36.248562 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.248540 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pww44\" (UniqueName: \"kubernetes.io/projected/7a3f566a-0aad-4697-9707-43c4760df7c8-kube-api-access-pww44\") pod \"isvc-pmml-runtime-predictor-7d448dc787-rxvbn\" (UID: \"7a3f566a-0aad-4697-9707-43c4760df7c8\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:10:36.360763 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.360669 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:10:36.484666 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.484636 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn"] Mar 18 17:10:36.485780 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:10:36.485750 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3f566a_0aad_4697_9707_43c4760df7c8.slice/crio-9813f1ef83f497dd0be6dc629f9a2ac673b3cfe19d8f7e0c685b3bc5406bc2db WatchSource:0}: Error finding container 9813f1ef83f497dd0be6dc629f9a2ac673b3cfe19d8f7e0c685b3bc5406bc2db: Status 404 returned error can't find the container with id 9813f1ef83f497dd0be6dc629f9a2ac673b3cfe19d8f7e0c685b3bc5406bc2db Mar 18 17:10:36.544529 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:36.544500 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" event={"ID":"7a3f566a-0aad-4697-9707-43c4760df7c8","Type":"ContainerStarted","Data":"9813f1ef83f497dd0be6dc629f9a2ac673b3cfe19d8f7e0c685b3bc5406bc2db"} Mar 18 17:10:37.246447 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:37.246405 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Mar 18 17:10:37.550008 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:37.549892 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" event={"ID":"7a3f566a-0aad-4697-9707-43c4760df7c8","Type":"ContainerStarted","Data":"5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e"} Mar 18 17:10:39.121139 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.121116 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" Mar 18 17:10:39.164907 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.164882 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncpvs\" (UniqueName: \"kubernetes.io/projected/a42992c6-f3b5-44aa-a2e9-6281900a9307-kube-api-access-ncpvs\") pod \"a42992c6-f3b5-44aa-a2e9-6281900a9307\" (UID: \"a42992c6-f3b5-44aa-a2e9-6281900a9307\") " Mar 18 17:10:39.165040 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.164920 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a42992c6-f3b5-44aa-a2e9-6281900a9307-kserve-provision-location\") pod \"a42992c6-f3b5-44aa-a2e9-6281900a9307\" (UID: \"a42992c6-f3b5-44aa-a2e9-6281900a9307\") " Mar 18 17:10:39.165261 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.165239 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a42992c6-f3b5-44aa-a2e9-6281900a9307-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a42992c6-f3b5-44aa-a2e9-6281900a9307" (UID: "a42992c6-f3b5-44aa-a2e9-6281900a9307"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:10:39.167117 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.167094 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42992c6-f3b5-44aa-a2e9-6281900a9307-kube-api-access-ncpvs" (OuterVolumeSpecName: "kube-api-access-ncpvs") pod "a42992c6-f3b5-44aa-a2e9-6281900a9307" (UID: "a42992c6-f3b5-44aa-a2e9-6281900a9307"). InnerVolumeSpecName "kube-api-access-ncpvs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:10:39.266088 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.266065 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ncpvs\" (UniqueName: \"kubernetes.io/projected/a42992c6-f3b5-44aa-a2e9-6281900a9307-kube-api-access-ncpvs\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:10:39.266088 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.266088 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a42992c6-f3b5-44aa-a2e9-6281900a9307-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:10:39.558094 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.557999 2575 generic.go:358] "Generic (PLEG): container finished" podID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerID="d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06" exitCode=0 Mar 18 17:10:39.558094 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.558047 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" event={"ID":"a42992c6-f3b5-44aa-a2e9-6281900a9307","Type":"ContainerDied","Data":"d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06"} Mar 18 17:10:39.558094 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.558072 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" Mar 18 17:10:39.558094 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.558082 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4" event={"ID":"a42992c6-f3b5-44aa-a2e9-6281900a9307","Type":"ContainerDied","Data":"ca7d5826d58c255c36d6b42653a172434f3cef19b51b5108719d91a8c9998887"} Mar 18 17:10:39.558443 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.558102 2575 scope.go:117] "RemoveContainer" containerID="d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06" Mar 18 17:10:39.566814 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.566792 2575 scope.go:117] "RemoveContainer" containerID="9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5" Mar 18 17:10:39.578596 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.578532 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4"] Mar 18 17:10:39.579993 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.579012 2575 scope.go:117] "RemoveContainer" containerID="d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06" Mar 18 17:10:39.580124 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:10:39.580095 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06\": container with ID starting with d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06 not found: ID does not exist" containerID="d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06" Mar 18 17:10:39.580215 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.580139 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06"} err="failed to get container status \"d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06\": rpc error: code = NotFound desc = could not find container \"d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06\": container with ID starting with d98143f31050dee9c0db25581b9dbeb3ed3edfd2b36803370168f9ab4410cd06 not found: ID does not exist" Mar 18 17:10:39.580215 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.580166 2575 scope.go:117] "RemoveContainer" containerID="9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5" Mar 18 17:10:39.580614 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:10:39.580578 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5\": container with ID starting with 9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5 not found: ID does not exist" containerID="9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5" Mar 18 17:10:39.580702 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.580628 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5"} err="failed to get container status \"9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5\": rpc error: code = NotFound desc = could not find container \"9b9f458b915757dcd8f7c1a0ac46fbf0474570bdd220d8dc02cde53541cbdef5\": container with ID starting with 
Mar 18 17:10:39.582904 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:39.582877 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-bbb95f64c-nf9m4"]
Mar 18 17:10:40.291226 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:40.291163 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" path="/var/lib/kubelet/pods/a42992c6-f3b5-44aa-a2e9-6281900a9307/volumes"
Mar 18 17:10:40.562931 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:40.562836 2575 generic.go:358] "Generic (PLEG): container finished" podID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerID="5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e" exitCode=0
Mar 18 17:10:40.562931 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:40.562889 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" event={"ID":"7a3f566a-0aad-4697-9707-43c4760df7c8","Type":"ContainerDied","Data":"5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e"}
Mar 18 17:10:41.567601 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:41.567566 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" event={"ID":"7a3f566a-0aad-4697-9707-43c4760df7c8","Type":"ContainerStarted","Data":"f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82"}
Mar 18 17:10:41.568031 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:41.567900 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn"
Mar 18 17:10:41.568803 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:41.568782 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused"
Mar 18 17:10:41.587158 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:41.585151 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podStartSLOduration=5.585131306 podStartE2EDuration="5.585131306s" podCreationTimestamp="2026-03-18 17:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:10:41.581812468 +0000 UTC m=+1559.893065383" watchObservedRunningTime="2026-03-18 17:10:41.585131306 +0000 UTC m=+1559.896384223"
Mar 18 17:10:42.571145 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:42.571109 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused"
Mar 18 17:10:44.287225 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:10:44.287185 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:10:52.572110 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:10:52.572068 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused"
Mar 18 17:10:57.287366 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:10:57.287328 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:11:02.571351 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:11:02.571305 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused"
Mar 18 17:11:12.288596 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:11:12.288471 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:11:12.571767 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:11:12.571674 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused"
Mar 18 17:11:22.571539 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:11:22.571447 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused"
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Mar 18 17:11:23.287287 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:11:23.287256 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:11:32.571095 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:11:32.571054 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Mar 18 17:11:34.287497 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:11:34.287453 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:11:42.571605 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:11:42.571566 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Mar 18 17:11:46.287036 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:11:46.286936 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:11:50.286006 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:11:50.285963 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Mar 18 17:12:00.290149 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:00.290123 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:12:01.287280 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:12:01.287242 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:12:06.871080 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:06.871047 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn"] Mar 18 17:12:06.871455 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:06.871319 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" containerID="cri-o://f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82" gracePeriod=30 Mar 18 17:12:07.249161 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.249121 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5"] Mar 18 17:12:07.249572 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.249553 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="storage-initializer" Mar 18 17:12:07.249681 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.249575 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="storage-initializer" Mar 18 17:12:07.249681 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.249593 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" Mar 18 17:12:07.249681 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.249601 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" Mar 18 17:12:07.249847 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.249686 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a42992c6-f3b5-44aa-a2e9-6281900a9307" containerName="kserve-container" Mar 18 17:12:07.252709 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.252688 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:12:07.260802 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.260776 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5"] Mar 18 17:12:07.303089 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.303056 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af629a9-50bc-4252-898a-50744552feb1-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5\" (UID: \"8af629a9-50bc-4252-898a-50744552feb1\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:12:07.303226 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.303093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w64wf\" (UniqueName: \"kubernetes.io/projected/8af629a9-50bc-4252-898a-50744552feb1-kube-api-access-w64wf\") pod \"isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5\" (UID: \"8af629a9-50bc-4252-898a-50744552feb1\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:12:07.404323 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.404286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af629a9-50bc-4252-898a-50744552feb1-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5\" (UID: \"8af629a9-50bc-4252-898a-50744552feb1\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:12:07.404536 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.404332 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w64wf\" (UniqueName: \"kubernetes.io/projected/8af629a9-50bc-4252-898a-50744552feb1-kube-api-access-w64wf\") pod \"isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5\" (UID: \"8af629a9-50bc-4252-898a-50744552feb1\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:12:07.404727 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.404704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af629a9-50bc-4252-898a-50744552feb1-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5\" (UID: \"8af629a9-50bc-4252-898a-50744552feb1\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:12:07.412178 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.412159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w64wf\" (UniqueName: \"kubernetes.io/projected/8af629a9-50bc-4252-898a-50744552feb1-kube-api-access-w64wf\") pod \"isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5\" (UID: \"8af629a9-50bc-4252-898a-50744552feb1\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:12:07.563630 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.563550 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:12:07.685296 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.685267 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5"] Mar 18 17:12:07.688301 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:12:07.688268 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8af629a9_50bc_4252_898a_50744552feb1.slice/crio-c156c9add36ae109e7f4e4efd7a79aa33913478864cf94963112abd17ee0490f WatchSource:0}: Error finding container c156c9add36ae109e7f4e4efd7a79aa33913478864cf94963112abd17ee0490f: Status 404 returned error can't find the container with id c156c9add36ae109e7f4e4efd7a79aa33913478864cf94963112abd17ee0490f Mar 18 17:12:07.854095 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.853993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" event={"ID":"8af629a9-50bc-4252-898a-50744552feb1","Type":"ContainerStarted","Data":"17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096"} Mar 18 17:12:07.854095 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:07.854038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" event={"ID":"8af629a9-50bc-4252-898a-50744552feb1","Type":"ContainerStarted","Data":"c156c9add36ae109e7f4e4efd7a79aa33913478864cf94963112abd17ee0490f"} Mar 18 17:12:10.286314 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.286275 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Mar 18 17:12:10.412704 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.412683 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:12:10.428631 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.428547 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pww44\" (UniqueName: \"kubernetes.io/projected/7a3f566a-0aad-4697-9707-43c4760df7c8-kube-api-access-pww44\") pod \"7a3f566a-0aad-4697-9707-43c4760df7c8\" (UID: \"7a3f566a-0aad-4697-9707-43c4760df7c8\") " Mar 18 17:12:10.428631 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.428618 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a3f566a-0aad-4697-9707-43c4760df7c8-kserve-provision-location\") pod \"7a3f566a-0aad-4697-9707-43c4760df7c8\" (UID: \"7a3f566a-0aad-4697-9707-43c4760df7c8\") " Mar 18 17:12:10.428955 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.428927 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3f566a-0aad-4697-9707-43c4760df7c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7a3f566a-0aad-4697-9707-43c4760df7c8" (UID: "7a3f566a-0aad-4697-9707-43c4760df7c8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:12:10.430945 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.430921 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3f566a-0aad-4697-9707-43c4760df7c8-kube-api-access-pww44" (OuterVolumeSpecName: "kube-api-access-pww44") pod "7a3f566a-0aad-4697-9707-43c4760df7c8" (UID: "7a3f566a-0aad-4697-9707-43c4760df7c8"). InnerVolumeSpecName "kube-api-access-pww44". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:12:10.530033 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.530012 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pww44\" (UniqueName: \"kubernetes.io/projected/7a3f566a-0aad-4697-9707-43c4760df7c8-kube-api-access-pww44\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:12:10.530033 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.530033 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a3f566a-0aad-4697-9707-43c4760df7c8-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:12:10.867462 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.867339 2575 generic.go:358] "Generic (PLEG): container finished" podID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerID="f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82" exitCode=0 Mar 18 17:12:10.867462 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.867428 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" Mar 18 17:12:10.867692 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.867423 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" event={"ID":"7a3f566a-0aad-4697-9707-43c4760df7c8","Type":"ContainerDied","Data":"f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82"} Mar 18 17:12:10.867692 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.867540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn" event={"ID":"7a3f566a-0aad-4697-9707-43c4760df7c8","Type":"ContainerDied","Data":"9813f1ef83f497dd0be6dc629f9a2ac673b3cfe19d8f7e0c685b3bc5406bc2db"} Mar 18 17:12:10.867692 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.867562 2575 scope.go:117] "RemoveContainer" containerID="f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82" Mar 18 17:12:10.881949 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.881930 2575 scope.go:117] "RemoveContainer" containerID="5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e" Mar 18 17:12:10.890968 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.890945 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn"] Mar 18 17:12:10.897080 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.897055 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7d448dc787-rxvbn"] Mar 18 17:12:10.900754 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.900730 2575 scope.go:117] "RemoveContainer" containerID="f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82" Mar 18 17:12:10.901052 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:12:10.901032 2575 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82\": container with ID starting with f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82 not found: ID does not exist" containerID="f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82" Mar 18 17:12:10.901131 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.901066 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82"} err="failed to get container status \"f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82\": rpc error: code = NotFound desc = could not find container \"f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82\": container with ID starting with f8ab9dc15c8ebdfb8ae7dde65356530256f64366d0de9784e41f196cf525ab82 not found: ID does not exist" Mar 18 17:12:10.901131 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.901092 2575 scope.go:117] "RemoveContainer" containerID="5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e" Mar 18 17:12:10.901394 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:12:10.901351 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e\": container with ID starting with 5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e not found: ID does not exist" containerID="5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e" Mar 18 17:12:10.901436 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:10.901402 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e"} err="failed to get container status \"5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e\": rpc error: code = NotFound desc = could not find container \"5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e\": container with ID starting with 5ff149c4de879ff7ca91aa2b236ffe14a7188c5e7cd381135a1189c1dee06b9e not found: ID does not exist" Mar 18 17:12:11.872550 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:11.872462 2575 generic.go:358] "Generic (PLEG): container finished" podID="8af629a9-50bc-4252-898a-50744552feb1" containerID="17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096" exitCode=0 Mar 18 17:12:11.872550 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:11.872503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" event={"ID":"8af629a9-50bc-4252-898a-50744552feb1","Type":"ContainerDied","Data":"17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096"} Mar 18 17:12:12.291372 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:12.291337 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" path="/var/lib/kubelet/pods/7a3f566a-0aad-4697-9707-43c4760df7c8/volumes" Mar 18 17:12:12.877261 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:12.877229 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" event={"ID":"8af629a9-50bc-4252-898a-50744552feb1","Type":"ContainerStarted","Data":"1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3"} Mar 18 17:12:12.877715 ip-10-0-139-49 
kubenswrapper[2575]: I0318 17:12:12.877551 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:12:12.878856 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:12.878831 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Mar 18 17:12:12.892528 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:12.892488 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podStartSLOduration=5.892475965 podStartE2EDuration="5.892475965s" podCreationTimestamp="2026-03-18 17:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:12:12.89062173 +0000 UTC m=+1651.201874672" watchObservedRunningTime="2026-03-18 17:12:12.892475965 +0000 UTC m=+1651.203728886" Mar 18 17:12:13.881106 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:13.881062 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Mar 18 17:12:15.287554 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:12:15.287523 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:12:23.881669 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:23.881617 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Mar 18 17:12:26.287509 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:12:26.287444 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" 
pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:12:33.881799 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:33.881748 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Mar 18 17:12:38.286735 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:12:38.286703 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:12:43.881683 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:43.881641 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Mar 18 17:12:53.881601 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:12:53.881565 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Mar 18 17:12:53.884303 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:12:53.884273 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:12:53.884497 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:12:53.884459 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 17:12:53.885632 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:12:53.885603 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:13:03.881834 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:03.881786 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Mar 18 17:13:08.286898 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:13:08.286851 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:13:13.881186 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:13.881147 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Mar 18 17:13:19.287052 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:13:19.287024 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:13:23.881663 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:23.881618 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Mar 18 17:13:30.286729 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:13:30.286700 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:13:33.882551 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:33.882516 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:13:38.274702 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.274668 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5"] Mar 18 17:13:38.275138 ip-10-0-139-49 kubenswrapper[2575]: I0318 
17:13:38.274893 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" containerID="cri-o://1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3" gracePeriod=30 Mar 18 17:13:38.547135 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.547046 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx"] Mar 18 17:13:38.547443 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.547429 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="storage-initializer" Mar 18 17:13:38.547502 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.547444 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="storage-initializer" Mar 18 17:13:38.547502 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.547463 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" Mar 18 17:13:38.547502 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.547471 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" Mar 18 17:13:38.547597 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.547525 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a3f566a-0aad-4697-9707-43c4760df7c8" containerName="kserve-container" Mar 18 17:13:38.550623 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.550605 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:13:38.556736 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.556703 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx"] Mar 18 17:13:38.622665 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.622629 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65rpb\" (UniqueName: \"kubernetes.io/projected/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kube-api-access-65rpb\") pod \"isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx\" (UID: \"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf\") " pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:13:38.622839 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.622723 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kserve-provision-location\") pod \"isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx\" (UID: \"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf\") " pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:13:38.723829 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.723796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65rpb\" (UniqueName: \"kubernetes.io/projected/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kube-api-access-65rpb\") pod \"isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx\" (UID: \"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf\") " pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:13:38.724014 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.723859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kserve-provision-location\") pod \"isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx\" (UID: \"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf\") " pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:13:38.724247 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.724227 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kserve-provision-location\") pod \"isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx\" (UID: \"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf\") " pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:13:38.732137 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.732103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65rpb\" (UniqueName: \"kubernetes.io/projected/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kube-api-access-65rpb\") pod \"isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx\" (UID: \"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf\") " pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:13:38.862191 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.862100 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:13:38.987984 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:38.987951 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx"] Mar 18 17:13:38.991621 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:13:38.991592 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc86311_8a4b_4bba_adcc_46ffe55d6bbf.slice/crio-ce3ab32e3e5c1267c25cf66d589d6c06a0443acd4be1740395cb9f6fe0d2b0e1 WatchSource:0}: Error finding container ce3ab32e3e5c1267c25cf66d589d6c06a0443acd4be1740395cb9f6fe0d2b0e1: Status 404 returned error can't find the container with id ce3ab32e3e5c1267c25cf66d589d6c06a0443acd4be1740395cb9f6fe0d2b0e1 Mar 18 17:13:39.164392 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:39.164328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" event={"ID":"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf","Type":"ContainerStarted","Data":"fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf"} Mar 18 17:13:39.164590 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:39.164401 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" event={"ID":"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf","Type":"ContainerStarted","Data":"ce3ab32e3e5c1267c25cf66d589d6c06a0443acd4be1740395cb9f6fe0d2b0e1"} Mar 18 17:13:41.286984 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:13:41.286953 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:13:41.523490 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:41.523467 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:13:41.649858 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:41.649835 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w64wf\" (UniqueName: \"kubernetes.io/projected/8af629a9-50bc-4252-898a-50744552feb1-kube-api-access-w64wf\") pod \"8af629a9-50bc-4252-898a-50744552feb1\" (UID: \"8af629a9-50bc-4252-898a-50744552feb1\") " Mar 18 17:13:41.650000 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:41.649898 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af629a9-50bc-4252-898a-50744552feb1-kserve-provision-location\") pod \"8af629a9-50bc-4252-898a-50744552feb1\" (UID: \"8af629a9-50bc-4252-898a-50744552feb1\") " Mar 18 17:13:41.650247 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:41.650222 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8af629a9-50bc-4252-898a-50744552feb1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8af629a9-50bc-4252-898a-50744552feb1" (UID: "8af629a9-50bc-4252-898a-50744552feb1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:13:41.652020 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:41.652003 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8af629a9-50bc-4252-898a-50744552feb1-kube-api-access-w64wf" (OuterVolumeSpecName: "kube-api-access-w64wf") pod "8af629a9-50bc-4252-898a-50744552feb1" (UID: "8af629a9-50bc-4252-898a-50744552feb1"). InnerVolumeSpecName "kube-api-access-w64wf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:13:41.750618 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:41.750576 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w64wf\" (UniqueName: \"kubernetes.io/projected/8af629a9-50bc-4252-898a-50744552feb1-kube-api-access-w64wf\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:13:41.750618 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:41.750612 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af629a9-50bc-4252-898a-50744552feb1-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:13:42.175993 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.175964 2575 generic.go:358] "Generic (PLEG): container finished" podID="8af629a9-50bc-4252-898a-50744552feb1" containerID="1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3" exitCode=0 Mar 18 17:13:42.176152 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.176035 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" Mar 18 17:13:42.176152 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.176043 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" event={"ID":"8af629a9-50bc-4252-898a-50744552feb1","Type":"ContainerDied","Data":"1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3"} Mar 18 17:13:42.176152 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.176083 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5" event={"ID":"8af629a9-50bc-4252-898a-50744552feb1","Type":"ContainerDied","Data":"c156c9add36ae109e7f4e4efd7a79aa33913478864cf94963112abd17ee0490f"} Mar 18 17:13:42.176152 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.176103 2575 scope.go:117] "RemoveContainer" containerID="1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3" Mar 18 17:13:42.185153 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.185131 2575 scope.go:117] "RemoveContainer" containerID="17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096" Mar 18 17:13:42.194468 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.194453 2575 scope.go:117] "RemoveContainer" containerID="1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3" Mar 18 17:13:42.194735 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:13:42.194711 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3\": container with ID starting with 1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3 not found: ID does not exist" containerID="1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3" Mar 18 17:13:42.194825 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.194748 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3"} err="failed to get container status \"1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3\": rpc error: code = NotFound desc = could not find container \"1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3\": container with ID starting with 1608c9ce3061643b6af2cd82fd0893b447018f71c3989c3d1459f797a39a2ea3 not found: ID does not exist" Mar 18 17:13:42.194825 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.194771 2575 scope.go:117] "RemoveContainer" containerID="17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096" Mar 18 17:13:42.195026 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:13:42.195007 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096\": container with ID starting with 17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096 not found: ID does not exist" containerID="17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096" Mar 18 17:13:42.195085 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.195031 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096"} err="failed to get container status \"17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096\": rpc error: code = NotFound 
desc = could not find container \"17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096\": container with ID starting with 17c988f370a60ec3d40fe86e71b011af9c8e1267aba25d03a6da12b0ba9c3096 not found: ID does not exist" Mar 18 17:13:42.195806 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.195782 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5"] Mar 18 17:13:42.200776 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.200753 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-799cc7bf5-mgwp5"] Mar 18 17:13:42.290931 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:42.290905 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8af629a9-50bc-4252-898a-50744552feb1" path="/var/lib/kubelet/pods/8af629a9-50bc-4252-898a-50744552feb1/volumes" Mar 18 17:13:43.180353 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:43.180324 2575 generic.go:358] "Generic (PLEG): container finished" podID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerID="fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf" exitCode=0 Mar 18 17:13:43.180545 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:43.180402 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" event={"ID":"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf","Type":"ContainerDied","Data":"fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf"} Mar 18 17:13:44.185690 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:44.185655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" event={"ID":"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf","Type":"ContainerStarted","Data":"417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339"} Mar 18 17:13:44.186109 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:44.186039 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:13:44.187173 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:44.187145 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Mar 18 17:13:44.204016 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:44.203969 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podStartSLOduration=6.20395812 podStartE2EDuration="6.20395812s" podCreationTimestamp="2026-03-18 17:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:13:44.201535778 +0000 UTC m=+1742.512788699" watchObservedRunningTime="2026-03-18 17:13:44.20395812 +0000 UTC m=+1742.515211039" Mar 18 17:13:45.190339 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:45.190298 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Mar 18 17:13:54.287457 ip-10-0-139-49 
kubenswrapper[2575]: E0318 17:13:54.287426 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:13:55.190653 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:13:55.190611 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Mar 18 17:14:05.190500 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:14:05.190448 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Mar 18 17:14:06.291156 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:14:06.291118 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:14:15.190505 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:14:15.190462 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Mar 18 17:14:21.287201 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:14:21.287173 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" 
pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:14:25.191302 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:14:25.191261 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Mar 18 17:14:34.288388 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:14:34.287031 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:14:35.190747 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:14:35.190700 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Mar 18 17:14:42.293915 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:14:42.293887 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:14:42.297806 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:14:42.297788 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:14:45.190403 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:14:45.190339 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Mar 18 17:14:49.287319 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:14:49.287264 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:14:55.190632 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:14:55.190587 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Mar 18 17:15:04.289617 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:04.287602 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:15:05.192154 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:05.192123 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:15:08.749951 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.749914 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46"] Mar 18 17:15:08.750351 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.750247 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" Mar 18 17:15:08.750351 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.750257 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" Mar 18 17:15:08.750351 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.750273 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="storage-initializer" Mar 18 17:15:08.750351 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.750278 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="storage-initializer" Mar 18 17:15:08.750351 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.750331 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8af629a9-50bc-4252-898a-50744552feb1" containerName="kserve-container" Mar 18 17:15:08.756096 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.754835 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" Mar 18 17:15:08.757206 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.757177 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-c6d0b3\"" Mar 18 17:15:08.757370 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.757175 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-c6d0b3-dockercfg-zmgjg\"" Mar 18 17:15:08.760388 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.760336 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46"] Mar 18 17:15:08.857592 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.857506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d56m\" (UniqueName: \"kubernetes.io/projected/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kube-api-access-5d56m\") pod \"isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46\" (UID: \"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2\") " pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" Mar 18 17:15:08.857759 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.857619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kserve-provision-location\") pod \"isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46\" (UID: \"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2\") " pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" Mar 18 17:15:08.958279 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.958246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kserve-provision-location\") pod \"isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46\" (UID: \"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2\") " pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" Mar 18 17:15:08.958470 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.958317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5d56m\" (UniqueName: \"kubernetes.io/projected/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kube-api-access-5d56m\") pod \"isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46\" (UID: \"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2\") " pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" Mar 18 17:15:08.958653 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.958632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kserve-provision-location\") pod \"isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46\" (UID: \"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2\") " pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" Mar 18 17:15:08.965877 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:08.965852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d56m\" (UniqueName: \"kubernetes.io/projected/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kube-api-access-5d56m\") pod \"isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46\" (UID: \"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2\") " 
pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" Mar 18 17:15:09.067402 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:09.067318 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" Mar 18 17:15:09.184209 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:09.184184 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46"] Mar 18 17:15:09.186860 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:15:09.186830 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d9dd211_7bf1_4049_a6a7_5856daf7b3b2.slice/crio-2b957ea8a8bd81b5401051c3495dc55fe1e74db31509262d18d4e4f31e55a321 WatchSource:0}: Error finding container 2b957ea8a8bd81b5401051c3495dc55fe1e74db31509262d18d4e4f31e55a321: Status 404 returned error can't find the container with id 2b957ea8a8bd81b5401051c3495dc55fe1e74db31509262d18d4e4f31e55a321 Mar 18 17:15:09.456325 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:09.456279 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" event={"ID":"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2","Type":"ContainerStarted","Data":"b22b623d0308301342117ce87e84766188d227c0a89d119c877d8899e082654f"} Mar 18 17:15:09.456325 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:09.456319 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" event={"ID":"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2","Type":"ContainerStarted","Data":"2b957ea8a8bd81b5401051c3495dc55fe1e74db31509262d18d4e4f31e55a321"} Mar 18 17:15:10.460920 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:10.460893 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_7d9dd211-7bf1-4049-a6a7-5856daf7b3b2/storage-initializer/0.log" Mar 18 17:15:10.461284 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:10.460937 2575 generic.go:358] "Generic (PLEG): container finished" podID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" containerID="b22b623d0308301342117ce87e84766188d227c0a89d119c877d8899e082654f" exitCode=1 Mar 18 17:15:10.461284 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:10.461003 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" event={"ID":"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2","Type":"ContainerDied","Data":"b22b623d0308301342117ce87e84766188d227c0a89d119c877d8899e082654f"} Mar 18 17:15:11.465626 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:11.465600 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_7d9dd211-7bf1-4049-a6a7-5856daf7b3b2/storage-initializer/1.log" Mar 18 17:15:11.466008 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:11.465993 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_7d9dd211-7bf1-4049-a6a7-5856daf7b3b2/storage-initializer/0.log" Mar 18 17:15:11.466067 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:11.466029 2575 generic.go:358] "Generic (PLEG): container finished" podID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" containerID="6d480fb4084fcca70317b76a569ec18f1ba1cb241c2825df6e8867b0d5ef8540" exitCode=1 Mar 18 
17:15:11.466128 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:11.466080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" event={"ID":"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2","Type":"ContainerDied","Data":"6d480fb4084fcca70317b76a569ec18f1ba1cb241c2825df6e8867b0d5ef8540"} Mar 18 17:15:11.466128 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:11.466111 2575 scope.go:117] "RemoveContainer" containerID="b22b623d0308301342117ce87e84766188d227c0a89d119c877d8899e082654f" Mar 18 17:15:11.466413 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:11.466393 2575 scope.go:117] "RemoveContainer" containerID="b22b623d0308301342117ce87e84766188d227c0a89d119c877d8899e082654f" Mar 18 17:15:11.480823 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:11.480797 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_kserve-ci-e2e-test_7d9dd211-7bf1-4049-a6a7-5856daf7b3b2_0 in pod sandbox 2b957ea8a8bd81b5401051c3495dc55fe1e74db31509262d18d4e4f31e55a321 from index: no such id: 'b22b623d0308301342117ce87e84766188d227c0a89d119c877d8899e082654f'" containerID="b22b623d0308301342117ce87e84766188d227c0a89d119c877d8899e082654f" Mar 18 17:15:11.480892 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:11.480841 2575 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_kserve-ci-e2e-test_7d9dd211-7bf1-4049-a6a7-5856daf7b3b2_0 in pod sandbox 2b957ea8a8bd81b5401051c3495dc55fe1e74db31509262d18d4e4f31e55a321 from index: no such id: 'b22b623d0308301342117ce87e84766188d227c0a89d119c877d8899e082654f'; Skipping pod \"isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_kserve-ci-e2e-test(7d9dd211-7bf1-4049-a6a7-5856daf7b3b2)\"" logger="UnhandledError" Mar 18 17:15:11.482205 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:11.482179 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_kserve-ci-e2e-test(7d9dd211-7bf1-4049-a6a7-5856daf7b3b2)\"" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" podUID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" Mar 18 17:15:12.470243 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:12.470215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_7d9dd211-7bf1-4049-a6a7-5856daf7b3b2/storage-initializer/1.log" Mar 18 17:15:12.470724 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:12.470703 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_kserve-ci-e2e-test(7d9dd211-7bf1-4049-a6a7-5856daf7b3b2)\"" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" podUID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" Mar 18 17:15:15.287778 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:15.287748 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:15:22.599149 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.599114 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46"] Mar 18 17:15:22.681437 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.681405 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx"] Mar 18 17:15:22.681729 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.681706 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" containerID="cri-o://417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339" gracePeriod=30 Mar 18 17:15:22.745076 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.745055 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_7d9dd211-7bf1-4049-a6a7-5856daf7b3b2/storage-initializer/1.log" Mar 18 17:15:22.745196 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.745115 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" Mar 18 17:15:22.760277 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.760249 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg"] Mar 18 17:15:22.760706 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.760681 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" containerName="storage-initializer" Mar 18 17:15:22.760706 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.760702 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" containerName="storage-initializer" Mar 18 17:15:22.761105 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.760725 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" containerName="storage-initializer" Mar 18 17:15:22.761105 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.760734 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" containerName="storage-initializer" Mar 18 17:15:22.761105 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.760838 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" containerName="storage-initializer" Mar 18 17:15:22.761105 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.761053 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" containerName="storage-initializer" Mar 18 17:15:22.765106 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.765088 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" Mar 18 17:15:22.766154 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.766132 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d56m\" (UniqueName: \"kubernetes.io/projected/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kube-api-access-5d56m\") pod \"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2\" (UID: \"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2\") " Mar 18 17:15:22.766240 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.766182 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kserve-provision-location\") pod \"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2\" (UID: \"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2\") " Mar 18 17:15:22.766438 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.766417 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" (UID: "7d9dd211-7bf1-4049-a6a7-5856daf7b3b2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:15:22.766769 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.766751 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-87d4cc\"" Mar 18 17:15:22.766844 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.766806 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-87d4cc-dockercfg-5qxfk\"" Mar 18 17:15:22.768835 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.768814 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kube-api-access-5d56m" (OuterVolumeSpecName: "kube-api-access-5d56m") pod "7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" (UID: "7d9dd211-7bf1-4049-a6a7-5856daf7b3b2"). InnerVolumeSpecName "kube-api-access-5d56m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:15:22.769794 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.769775 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg"] Mar 18 17:15:22.867105 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.867018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kserve-provision-location\") pod \"isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg\" (UID: \"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c\") " pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" Mar 18 17:15:22.867105 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.867068 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z56tg\" (UniqueName: \"kubernetes.io/projected/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kube-api-access-z56tg\") pod \"isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg\" (UID: \"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c\") " pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" Mar 18 17:15:22.867289 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.867156 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5d56m\" (UniqueName: \"kubernetes.io/projected/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kube-api-access-5d56m\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:15:22.867289 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.867179 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:15:22.968433 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.968391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kserve-provision-location\") pod \"isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg\" (UID: \"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c\") " pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" Mar 18 17:15:22.968630 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.968447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z56tg\" (UniqueName: 
\"kubernetes.io/projected/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kube-api-access-z56tg\") pod \"isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg\" (UID: \"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c\") " pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" Mar 18 17:15:22.968836 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.968812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kserve-provision-location\") pod \"isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg\" (UID: \"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c\") " pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" Mar 18 17:15:22.975984 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:22.975957 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z56tg\" (UniqueName: \"kubernetes.io/projected/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kube-api-access-z56tg\") pod \"isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg\" (UID: \"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c\") " pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" Mar 18 17:15:23.081250 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:23.081219 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" Mar 18 17:15:23.202909 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:23.202728 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg"] Mar 18 17:15:23.204997 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:15:23.204971 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa25448_5a33_4bfe_9cc1_8afb2ef3ec4c.slice/crio-688b0dac0d4f6cc2fcb853a73b3c1bd191a41bbd825c9616af863c77557c97ac WatchSource:0}: Error finding container 688b0dac0d4f6cc2fcb853a73b3c1bd191a41bbd825c9616af863c77557c97ac: Status 404 returned error can't find the container with id 688b0dac0d4f6cc2fcb853a73b3c1bd191a41bbd825c9616af863c77557c97ac Mar 18 17:15:23.506466 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:23.506439 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46_7d9dd211-7bf1-4049-a6a7-5856daf7b3b2/storage-initializer/1.log" Mar 18 17:15:23.506645 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:23.506546 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" event={"ID":"7d9dd211-7bf1-4049-a6a7-5856daf7b3b2","Type":"ContainerDied","Data":"2b957ea8a8bd81b5401051c3495dc55fe1e74db31509262d18d4e4f31e55a321"} Mar 18 17:15:23.506645 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:23.506573 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46" Mar 18 17:15:23.506645 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:23.506583 2575 scope.go:117] "RemoveContainer" containerID="6d480fb4084fcca70317b76a569ec18f1ba1cb241c2825df6e8867b0d5ef8540" Mar 18 17:15:23.507976 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:23.507952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" event={"ID":"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c","Type":"ContainerStarted","Data":"96fb385d63fe56c07d0bad85eab599b4b0f8909122bf3a1034aea869fe9ffd47"} Mar 18 17:15:23.508098 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:23.507983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" event={"ID":"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c","Type":"ContainerStarted","Data":"688b0dac0d4f6cc2fcb853a73b3c1bd191a41bbd825c9616af863c77557c97ac"} Mar 18 17:15:23.546141 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:23.546114 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46"] Mar 18 17:15:23.548658 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:23.548635 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c6d0b3-predictor-64ddc58f8b-5lg46"] Mar 18 17:15:24.290727 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:24.290648 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d9dd211-7bf1-4049-a6a7-5856daf7b3b2" path="/var/lib/kubelet/pods/7d9dd211-7bf1-4049-a6a7-5856daf7b3b2/volumes" Mar 18 17:15:24.512621 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:24.512594 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c/storage-initializer/0.log" Mar 18 17:15:24.512811 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:24.512634 2575 generic.go:358] "Generic (PLEG): container finished" podID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" containerID="96fb385d63fe56c07d0bad85eab599b4b0f8909122bf3a1034aea869fe9ffd47" exitCode=1 Mar 18 17:15:24.512811 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:24.512741 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" event={"ID":"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c","Type":"ContainerDied","Data":"96fb385d63fe56c07d0bad85eab599b4b0f8909122bf3a1034aea869fe9ffd47"} Mar 18 17:15:25.190457 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:25.190415 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Mar 18 17:15:25.518703 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:25.518618 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c/storage-initializer/1.log" Mar 18 17:15:25.519067 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:25.518953 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c/storage-initializer/0.log" Mar 
18 17:15:25.519067 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:25.518985 2575 generic.go:358] "Generic (PLEG): container finished" podID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" containerID="97b9950490538ced77c2c1d1936ad3ce50339ef3077abc9b8be03e487e7ae301" exitCode=1 Mar 18 17:15:25.519067 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:25.519053 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" event={"ID":"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c","Type":"ContainerDied","Data":"97b9950490538ced77c2c1d1936ad3ce50339ef3077abc9b8be03e487e7ae301"} Mar 18 17:15:25.519259 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:25.519086 2575 scope.go:117] "RemoveContainer" containerID="96fb385d63fe56c07d0bad85eab599b4b0f8909122bf3a1034aea869fe9ffd47" Mar 18 17:15:25.519308 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:25.519294 2575 scope.go:117] "RemoveContainer" containerID="96fb385d63fe56c07d0bad85eab599b4b0f8909122bf3a1034aea869fe9ffd47" Mar 18 17:15:25.538672 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:25.538638 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_kserve-ci-e2e-test_7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c_0 in pod sandbox 688b0dac0d4f6cc2fcb853a73b3c1bd191a41bbd825c9616af863c77557c97ac from index: no such id: '96fb385d63fe56c07d0bad85eab599b4b0f8909122bf3a1034aea869fe9ffd47'" containerID="96fb385d63fe56c07d0bad85eab599b4b0f8909122bf3a1034aea869fe9ffd47" Mar 18 17:15:25.538748 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:25.538684 2575 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_kserve-ci-e2e-test_7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c_0 in pod sandbox 688b0dac0d4f6cc2fcb853a73b3c1bd191a41bbd825c9616af863c77557c97ac from index: no such id: '96fb385d63fe56c07d0bad85eab599b4b0f8909122bf3a1034aea869fe9ffd47'; Skipping pod \"isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_kserve-ci-e2e-test(7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c)\"" logger="UnhandledError" Mar 18 17:15:25.540032 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:25.540009 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_kserve-ci-e2e-test(7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c)\"" pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" podUID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" Mar 18 17:15:26.523739 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:26.523713 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c/storage-initializer/1.log" Mar 18 17:15:26.524248 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:26.524227 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_kserve-ci-e2e-test(7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c)\"" 
pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" podUID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" Mar 18 17:15:27.527181 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.527158 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:15:27.527882 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.527857 2575 generic.go:358] "Generic (PLEG): container finished" podID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerID="417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339" exitCode=0 Mar 18 17:15:27.527983 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.527912 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" event={"ID":"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf","Type":"ContainerDied","Data":"417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339"} Mar 18 17:15:27.527983 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.527945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" event={"ID":"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf","Type":"ContainerDied","Data":"ce3ab32e3e5c1267c25cf66d589d6c06a0443acd4be1740395cb9f6fe0d2b0e1"} Mar 18 17:15:27.527983 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.527968 2575 scope.go:117] "RemoveContainer" containerID="417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339" Mar 18 17:15:27.536149 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.536131 2575 scope.go:117] "RemoveContainer" containerID="fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf" Mar 18 17:15:27.547444 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.547420 2575 scope.go:117] "RemoveContainer" containerID="417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339" Mar 18 17:15:27.548188 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:27.548047 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339\": container with ID starting with 417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339 not found: ID does not exist" containerID="417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339" Mar 18 17:15:27.548188 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.548095 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339"} err="failed to get container status \"417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339\": rpc error: code = NotFound desc = could not find container \"417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339\": container with ID starting with 417554d602dbea59263d94b605002f91913a720a55e6c24fa32b551dab49b339 not found: ID does not exist" Mar 18 17:15:27.548188 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.548123 2575 scope.go:117] "RemoveContainer" containerID="fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf" Mar 18 17:15:27.548492 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:27.548449 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf\": container with ID starting with 
fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf not found: ID does not exist" containerID="fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf" Mar 18 17:15:27.548592 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.548502 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf"} err="failed to get container status \"fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf\": rpc error: code = NotFound desc = could not find container \"fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf\": container with ID starting with fcb7de9e30f01b6d67e6e9e70ce045609858f4574f1a34e213325724e948abdf not found: ID does not exist" Mar 18 17:15:27.605471 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.605448 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65rpb\" (UniqueName: \"kubernetes.io/projected/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kube-api-access-65rpb\") pod \"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf\" (UID: \"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf\") " Mar 18 17:15:27.605629 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.605536 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kserve-provision-location\") pod \"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf\" (UID: \"8cc86311-8a4b-4bba-adcc-46ffe55d6bbf\") " Mar 18 17:15:27.605850 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.605823 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" (UID: "8cc86311-8a4b-4bba-adcc-46ffe55d6bbf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:15:27.607775 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.607748 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kube-api-access-65rpb" (OuterVolumeSpecName: "kube-api-access-65rpb") pod "8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" (UID: "8cc86311-8a4b-4bba-adcc-46ffe55d6bbf"). InnerVolumeSpecName "kube-api-access-65rpb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:15:27.672468 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.672433 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg"] Mar 18 17:15:27.706940 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.706909 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65rpb\" (UniqueName: \"kubernetes.io/projected/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kube-api-access-65rpb\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:15:27.706940 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.706937 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:15:27.795285 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.795263 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c/storage-initializer/1.log" Mar 18 17:15:27.795467 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.795321 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" Mar 18 17:15:27.908440 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.908407 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kserve-provision-location\") pod \"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c\" (UID: \"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c\") " Mar 18 17:15:27.908595 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.908454 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z56tg\" (UniqueName: \"kubernetes.io/projected/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kube-api-access-z56tg\") pod \"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c\" (UID: \"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c\") " Mar 18 17:15:27.908675 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.908653 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" (UID: "7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:15:27.910610 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.910592 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kube-api-access-z56tg" (OuterVolumeSpecName: "kube-api-access-z56tg") pod "7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" (UID: "7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c"). InnerVolumeSpecName "kube-api-access-z56tg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:15:27.953194 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953122 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x"] Mar 18 17:15:27.953469 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953457 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" containerName="storage-initializer" Mar 18 17:15:27.953523 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953470 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" containerName="storage-initializer" Mar 18 17:15:27.953523 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953481 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" Mar 18 17:15:27.953523 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953486 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" Mar 18 17:15:27.953523 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953502 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="storage-initializer" Mar 18 17:15:27.953523 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953515 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="storage-initializer" Mar 18 17:15:27.953702 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953567 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" containerName="storage-initializer" Mar 18 17:15:27.953702 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953575 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" containerName="kserve-container" Mar 18 17:15:27.953702 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953629 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" containerName="storage-initializer" Mar 18 17:15:27.953702 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953634 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" containerName="storage-initializer" Mar 18 17:15:27.953702 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.953693 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" containerName="storage-initializer" Mar 18 17:15:27.958146 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.958130 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:15:27.966397 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:27.966348 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x"] Mar 18 17:15:28.009153 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.009126 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw8xq\" (UniqueName: \"kubernetes.io/projected/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kube-api-access-cw8xq\") pod \"isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x\" (UID: \"f989052e-cbc1-4067-ab3c-6a9fe3acfc78\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:15:28.009278 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.009197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x\" (UID: \"f989052e-cbc1-4067-ab3c-6a9fe3acfc78\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:15:28.009278 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.009228 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:15:28.009278 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.009239 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z56tg\" (UniqueName: \"kubernetes.io/projected/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c-kube-api-access-z56tg\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:15:28.110429 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.110396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x\" (UID: \"f989052e-cbc1-4067-ab3c-6a9fe3acfc78\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:15:28.110600 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.110440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw8xq\" (UniqueName: \"kubernetes.io/projected/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kube-api-access-cw8xq\") pod \"isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x\" (UID: \"f989052e-cbc1-4067-ab3c-6a9fe3acfc78\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:15:28.110784 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.110766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x\" (UID: \"f989052e-cbc1-4067-ab3c-6a9fe3acfc78\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:15:28.118193 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.118169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cw8xq\" (UniqueName: \"kubernetes.io/projected/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kube-api-access-cw8xq\") pod \"isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x\" (UID: \"f989052e-cbc1-4067-ab3c-6a9fe3acfc78\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:15:28.269157 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.269061 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:15:28.390959 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.390802 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x"] Mar 18 17:15:28.393489 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:15:28.393464 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf989052e_cbc1_4067_ab3c_6a9fe3acfc78.slice/crio-29ef21799e4635629b1acf757cde80e60600cf52fc7cb5635b94a2af91a81400 WatchSource:0}: Error finding container 29ef21799e4635629b1acf757cde80e60600cf52fc7cb5635b94a2af91a81400: Status 404 returned error can't find the container with id 29ef21799e4635629b1acf757cde80e60600cf52fc7cb5635b94a2af91a81400 Mar 18 17:15:28.533083 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.532992 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" event={"ID":"f989052e-cbc1-4067-ab3c-6a9fe3acfc78","Type":"ContainerStarted","Data":"46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad"} Mar 18 17:15:28.533083 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.533032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" event={"ID":"f989052e-cbc1-4067-ab3c-6a9fe3acfc78","Type":"ContainerStarted","Data":"29ef21799e4635629b1acf757cde80e60600cf52fc7cb5635b94a2af91a81400"} Mar 18 17:15:28.534040 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.534021 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx" Mar 18 17:15:28.535291 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.535268 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg_7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c/storage-initializer/1.log" Mar 18 17:15:28.535420 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.535386 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" event={"ID":"7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c","Type":"ContainerDied","Data":"688b0dac0d4f6cc2fcb853a73b3c1bd191a41bbd825c9616af863c77557c97ac"} Mar 18 17:15:28.535420 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.535397 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg" Mar 18 17:15:28.535515 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.535427 2575 scope.go:117] "RemoveContainer" containerID="97b9950490538ced77c2c1d1936ad3ce50339ef3077abc9b8be03e487e7ae301" Mar 18 17:15:28.561858 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.561817 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx"] Mar 18 17:15:28.563280 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.563251 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c6d0b3-predictor-678d4cb6f9-9nhhx"] Mar 18 17:15:28.586256 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.586220 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg"] Mar 18 17:15:28.588623 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:28.588590 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-87d4cc-predictor-8499dc5f9d-kzwxg"] Mar 18 17:15:29.287159 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:29.287126 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:15:30.290488 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:30.290456 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c" path="/var/lib/kubelet/pods/7aa25448-5a33-4bfe-9cc1-8afb2ef3ec4c/volumes" Mar 18 17:15:30.290873 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:30.290804 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc86311-8a4b-4bba-adcc-46ffe55d6bbf" path="/var/lib/kubelet/pods/8cc86311-8a4b-4bba-adcc-46ffe55d6bbf/volumes" Mar 18 17:15:32.555596 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:32.555548 2575 generic.go:358] "Generic (PLEG): container finished" podID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerID="46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad" exitCode=0 Mar 18 17:15:32.556015 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:32.555595 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" event={"ID":"f989052e-cbc1-4067-ab3c-6a9fe3acfc78","Type":"ContainerDied","Data":"46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad"} Mar 18 17:15:32.556744 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:32.556728 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:15:43.287426 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:43.287381 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:15:54.642180 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:54.642141 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" event={"ID":"f989052e-cbc1-4067-ab3c-6a9fe3acfc78","Type":"ContainerStarted","Data":"a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74"} Mar 18 17:15:54.642616 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:54.642459 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:15:54.643599 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:54.643572 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Mar 18 17:15:54.660540 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:54.660494 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" podStartSLOduration=6.137895497 podStartE2EDuration="27.66048036s" podCreationTimestamp="2026-03-18 17:15:27 +0000 UTC" firstStartedPulling="2026-03-18 17:15:32.556847207 +0000 UTC m=+1850.868100107" lastFinishedPulling="2026-03-18 17:15:54.079432071 +0000 UTC m=+1872.390684970" observedRunningTime="2026-03-18 17:15:54.658822862 +0000 UTC m=+1872.970075788" watchObservedRunningTime="2026-03-18 17:15:54.66048036 +0000 UTC m=+1872.971733281" Mar 18 17:15:55.645963 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:15:55.645921 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Mar 18 17:15:57.287270 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:15:57.287239 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:16:05.646168 ip-10-0-139-49 
kubenswrapper[2575]: I0318 17:16:05.646121 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Mar 18 17:16:09.287126 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:16:09.287095 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:16:15.646531 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:16:15.646488 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Mar 18 17:16:23.287014 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:16:23.286980 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:16:25.646662 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:16:25.646620 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Mar 18 17:16:35.286605 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:16:35.286574 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" 
pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:16:35.646035 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:16:35.645998 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Mar 18 17:16:45.646819 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:16:45.646780 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Mar 18 17:16:48.287157 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:16:48.287125 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:16:55.646606 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:16:55.646565 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Mar 18 17:17:00.290820 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:17:00.290734 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:17:05.647532 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:05.647497 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:17:07.975520 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:07.975483 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x"] Mar 18 17:17:07.975903 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:07.975713 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" containerID="cri-o://a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74" gracePeriod=30 Mar 18 17:17:08.347153 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.347068 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w"] Mar 18 17:17:08.350507 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.350486 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" Mar 18 17:17:08.357925 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.357892 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w"] Mar 18 17:17:08.444954 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.444923 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w\" (UID: \"9e97971d-1fc6-4cc8-8cab-2d11ed56b422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" Mar 18 17:17:08.445129 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.445061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xd8c\" (UniqueName: \"kubernetes.io/projected/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kube-api-access-2xd8c\") pod \"isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w\" (UID: \"9e97971d-1fc6-4cc8-8cab-2d11ed56b422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" Mar 18 17:17:08.545881 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.545842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xd8c\" (UniqueName: \"kubernetes.io/projected/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kube-api-access-2xd8c\") pod \"isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w\" (UID: \"9e97971d-1fc6-4cc8-8cab-2d11ed56b422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" Mar 18 17:17:08.546082 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.545890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w\" (UID: \"9e97971d-1fc6-4cc8-8cab-2d11ed56b422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" Mar 18 17:17:08.546278 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.546259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w\" (UID: \"9e97971d-1fc6-4cc8-8cab-2d11ed56b422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" Mar 18 17:17:08.553179 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.553150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xd8c\" (UniqueName: 
\"kubernetes.io/projected/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kube-api-access-2xd8c\") pod \"isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w\" (UID: \"9e97971d-1fc6-4cc8-8cab-2d11ed56b422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" Mar 18 17:17:08.661046 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.661013 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" Mar 18 17:17:08.781852 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.781822 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w"] Mar 18 17:17:08.783742 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:17:08.783710 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e97971d_1fc6_4cc8_8cab_2d11ed56b422.slice/crio-9cb222d3dffd8bb78328fa0deaf5086a914a93fb01a68e0290abb5830c224cba WatchSource:0}: Error finding container 9cb222d3dffd8bb78328fa0deaf5086a914a93fb01a68e0290abb5830c224cba: Status 404 returned error can't find the container with id 9cb222d3dffd8bb78328fa0deaf5086a914a93fb01a68e0290abb5830c224cba Mar 18 17:17:08.887945 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.887912 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" event={"ID":"9e97971d-1fc6-4cc8-8cab-2d11ed56b422","Type":"ContainerStarted","Data":"efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25"} Mar 18 17:17:08.888119 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:08.887952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" event={"ID":"9e97971d-1fc6-4cc8-8cab-2d11ed56b422","Type":"ContainerStarted","Data":"9cb222d3dffd8bb78328fa0deaf5086a914a93fb01a68e0290abb5830c224cba"} Mar 18 17:17:11.286726 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:17:11.286694 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:17:12.726925 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.726897 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:17:12.879942 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.879905 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw8xq\" (UniqueName: \"kubernetes.io/projected/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kube-api-access-cw8xq\") pod \"f989052e-cbc1-4067-ab3c-6a9fe3acfc78\" (UID: \"f989052e-cbc1-4067-ab3c-6a9fe3acfc78\") " Mar 18 17:17:12.880103 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.880000 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kserve-provision-location\") pod \"f989052e-cbc1-4067-ab3c-6a9fe3acfc78\" (UID: \"f989052e-cbc1-4067-ab3c-6a9fe3acfc78\") " Mar 18 17:17:12.880344 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.880317 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f989052e-cbc1-4067-ab3c-6a9fe3acfc78" (UID: "f989052e-cbc1-4067-ab3c-6a9fe3acfc78"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:17:12.882199 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.882177 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kube-api-access-cw8xq" (OuterVolumeSpecName: "kube-api-access-cw8xq") pod "f989052e-cbc1-4067-ab3c-6a9fe3acfc78" (UID: "f989052e-cbc1-4067-ab3c-6a9fe3acfc78"). InnerVolumeSpecName "kube-api-access-cw8xq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:17:12.904244 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.904219 2575 generic.go:358] "Generic (PLEG): container finished" podID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerID="efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25" exitCode=0 Mar 18 17:17:12.904352 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.904295 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" event={"ID":"9e97971d-1fc6-4cc8-8cab-2d11ed56b422","Type":"ContainerDied","Data":"efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25"} Mar 18 17:17:12.905860 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.905838 2575 generic.go:358] "Generic (PLEG): container finished" podID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerID="a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74" exitCode=0 Mar 18 17:17:12.905963 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.905871 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" event={"ID":"f989052e-cbc1-4067-ab3c-6a9fe3acfc78","Type":"ContainerDied","Data":"a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74"} Mar 18 17:17:12.905963 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.905888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" event={"ID":"f989052e-cbc1-4067-ab3c-6a9fe3acfc78","Type":"ContainerDied","Data":"29ef21799e4635629b1acf757cde80e60600cf52fc7cb5635b94a2af91a81400"} Mar 18 17:17:12.905963 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.905893 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x" Mar 18 17:17:12.905963 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.905903 2575 scope.go:117] "RemoveContainer" containerID="a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74" Mar 18 17:17:12.914793 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.914779 2575 scope.go:117] "RemoveContainer" containerID="46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad" Mar 18 17:17:12.926227 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.926205 2575 scope.go:117] "RemoveContainer" containerID="a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74" Mar 18 17:17:12.926497 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:17:12.926476 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74\": container with ID starting with a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74 not found: ID does not exist" containerID="a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74" Mar 18 17:17:12.926595 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.926502 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74"} err="failed to get container status \"a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74\": rpc error: code = NotFound desc = could not find container \"a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74\": container with ID starting with a4e6b456a7dd22939c8c286f57cff6510d8c2fd42e171ef4c2bc82e26b9c3c74 not found: ID does not exist" Mar 18 17:17:12.926595 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.926519 2575 scope.go:117] "RemoveContainer" containerID="46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad" Mar 18 17:17:12.926794 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:17:12.926757 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad\": container with ID starting with 46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad not found: ID does not exist" containerID="46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad" Mar 18 17:17:12.926843 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.926785 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad"} err="failed to get container status \"46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad\": rpc error: code = NotFound desc = could not find container \"46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad\": container with ID starting with 46972bb52996b9c4f7bb1bd00ae896fb0d2ad0d28f7fdf59c003d5404d48f9ad not found: ID does not exist" Mar 18 17:17:12.931637 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.931567 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x"] Mar 18 17:17:12.933760 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.933742 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-6c564ccf49-9wq2x"] Mar 18 17:17:12.980521 ip-10-0-139-49 
Mar 18 17:17:12.980609 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:12.980524 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cw8xq\" (UniqueName: \"kubernetes.io/projected/f989052e-cbc1-4067-ab3c-6a9fe3acfc78-kube-api-access-cw8xq\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\""
Mar 18 17:17:13.911116 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:13.911079 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" event={"ID":"9e97971d-1fc6-4cc8-8cab-2d11ed56b422","Type":"ContainerStarted","Data":"013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168"}
Mar 18 17:17:13.911564 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:13.911437 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w"
Mar 18 17:17:13.912906 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:13.912873 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Mar 18 17:17:13.927873 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:13.927831 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podStartSLOduration=5.927818692 podStartE2EDuration="5.927818692s" podCreationTimestamp="2026-03-18 17:17:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:17:13.92637974 +0000 UTC m=+1952.237632655" watchObservedRunningTime="2026-03-18 17:17:13.927818692 +0000 UTC m=+1952.239071612"
Mar 18 17:17:14.290389 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:14.290292 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" path="/var/lib/kubelet/pods/f989052e-cbc1-4067-ab3c-6a9fe3acfc78/volumes"
Mar 18 17:17:14.918765 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:14.918729 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Mar 18 17:17:24.918989 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:24.918892 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Mar 18 17:17:25.286491 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:17:25.286461 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:17:34.919639 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:34.919601 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Mar 18 17:17:36.287535 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:17:36.287494 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:17:44.919175 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:44.919130 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Mar 18 17:17:49.286572 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:17:49.286542 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:17:54.919372 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:17:54.919326 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Mar 18 17:18:04.682249 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:18:04.682208 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast"
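Each predictor pod in this excerpt goes through the same not-ready window: readiness probes are refused on port 8080 until the model server starts listening, then the SyncLoop probe entry flips to status="ready" (roughly a minute for the sklearn predictor, and, reading the timestamps above and below, about 73 s for xgboost and 71 s for lightgbm). A sketch that computes the window per pod, under the same kubelet.log and one-entry-per-line assumptions:

    import re
    from datetime import datetime

    TS = re.compile(r"^Mar 18 (\d{2}:\d{2}:\d{2}\.\d+)")
    FAIL = re.compile(r'"Probe failed".*pod="([^"]+)"')
    READY = re.compile(r'probe="readiness" status="ready" pod="([^"]+)"')

    def readiness_windows(path="kubelet.log"):  # hypothetical saved copy
        first_fail, windows = {}, {}
        for line in open(path, encoding="utf-8"):
            m = TS.search(line)
            if not m:
                continue  # entries whose prefix fell outside the excerpt
            t = datetime.strptime(m.group(1), "%H:%M:%S.%f")
            if f := FAIL.search(line):
                first_fail.setdefault(f.group(1), t)  # keep earliest failure
            elif r := READY.search(line):
                if r.group(1) in first_fail:
                    windows[r.group(1)] = (t - first_fail[r.group(1)]).total_seconds()
        return windows

    for pod, secs in readiness_windows().items():
        print(f"{pod}: not-ready for ~{secs:.0f}s")

The sklearn figure it reports undercounts slightly, since that pod's first probe failures precede this excerpt.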
Mar 18 17:18:04.682647 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:18:04.682401 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 17:18:04.683590 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:18:04.683563 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:18:04.918775 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:04.918732 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Mar 18 17:18:14.918818 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:14.918772 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Mar 18 17:18:17.286454 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:17.286411 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Mar 18 17:18:19.286624 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:18:19.286562 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:18:27.287522 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:27.287491 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w"
Mar 18 17:18:28.173246 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.173209 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w"]
Mar 18 17:18:28.173619 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.173566 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" containerID="cri-o://013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168" gracePeriod=30
Mar 18 17:18:28.355412 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.355379 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6"]
"SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6"] Mar 18 17:18:28.355879 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.355867 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" Mar 18 17:18:28.355948 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.355887 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" Mar 18 17:18:28.355948 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.355910 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="storage-initializer" Mar 18 17:18:28.355948 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.355920 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="storage-initializer" Mar 18 17:18:28.356160 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.355999 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f989052e-cbc1-4067-ab3c-6a9fe3acfc78" containerName="kserve-container" Mar 18 17:18:28.359161 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.359140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:18:28.367818 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.367791 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6"] Mar 18 17:18:28.376460 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.376430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpzvf\" (UniqueName: \"kubernetes.io/projected/755f481a-ba52-4f66-96fa-b6ae8f2239df-kube-api-access-zpzvf\") pod \"isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6\" (UID: \"755f481a-ba52-4f66-96fa-b6ae8f2239df\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:18:28.376575 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.376480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/755f481a-ba52-4f66-96fa-b6ae8f2239df-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6\" (UID: \"755f481a-ba52-4f66-96fa-b6ae8f2239df\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:18:28.477477 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.477391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpzvf\" (UniqueName: \"kubernetes.io/projected/755f481a-ba52-4f66-96fa-b6ae8f2239df-kube-api-access-zpzvf\") pod \"isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6\" (UID: \"755f481a-ba52-4f66-96fa-b6ae8f2239df\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:18:28.477477 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.477442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/755f481a-ba52-4f66-96fa-b6ae8f2239df-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6\" (UID: 
\"755f481a-ba52-4f66-96fa-b6ae8f2239df\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:18:28.477899 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.477878 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/755f481a-ba52-4f66-96fa-b6ae8f2239df-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6\" (UID: \"755f481a-ba52-4f66-96fa-b6ae8f2239df\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:18:28.484765 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.484741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpzvf\" (UniqueName: \"kubernetes.io/projected/755f481a-ba52-4f66-96fa-b6ae8f2239df-kube-api-access-zpzvf\") pod \"isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6\" (UID: \"755f481a-ba52-4f66-96fa-b6ae8f2239df\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:18:28.669618 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.669578 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:18:28.793921 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:28.793892 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6"] Mar 18 17:18:28.795618 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:18:28.795582 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod755f481a_ba52_4f66_96fa_b6ae8f2239df.slice/crio-a7578722ade4e3b10a8084d817c0d62827fd83a72b599fd0594f9a01916c12c2 WatchSource:0}: Error finding container a7578722ade4e3b10a8084d817c0d62827fd83a72b599fd0594f9a01916c12c2: Status 404 returned error can't find the container with id a7578722ade4e3b10a8084d817c0d62827fd83a72b599fd0594f9a01916c12c2 Mar 18 17:18:29.166044 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:29.166009 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" event={"ID":"755f481a-ba52-4f66-96fa-b6ae8f2239df","Type":"ContainerStarted","Data":"218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1"} Mar 18 17:18:29.166044 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:29.166049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" event={"ID":"755f481a-ba52-4f66-96fa-b6ae8f2239df","Type":"ContainerStarted","Data":"a7578722ade4e3b10a8084d817c0d62827fd83a72b599fd0594f9a01916c12c2"} Mar 18 17:18:32.715402 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:32.715379 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" Mar 18 17:18:32.806513 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:32.806426 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kserve-provision-location\") pod \"9e97971d-1fc6-4cc8-8cab-2d11ed56b422\" (UID: \"9e97971d-1fc6-4cc8-8cab-2d11ed56b422\") " Mar 18 17:18:32.806513 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:32.806480 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xd8c\" (UniqueName: \"kubernetes.io/projected/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kube-api-access-2xd8c\") pod \"9e97971d-1fc6-4cc8-8cab-2d11ed56b422\" (UID: \"9e97971d-1fc6-4cc8-8cab-2d11ed56b422\") " Mar 18 17:18:32.806782 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:32.806757 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9e97971d-1fc6-4cc8-8cab-2d11ed56b422" (UID: "9e97971d-1fc6-4cc8-8cab-2d11ed56b422"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:18:32.808630 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:32.808612 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kube-api-access-2xd8c" (OuterVolumeSpecName: "kube-api-access-2xd8c") pod "9e97971d-1fc6-4cc8-8cab-2d11ed56b422" (UID: "9e97971d-1fc6-4cc8-8cab-2d11ed56b422"). InnerVolumeSpecName "kube-api-access-2xd8c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:18:32.907133 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:32.907102 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:18:32.907133 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:32.907129 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2xd8c\" (UniqueName: \"kubernetes.io/projected/9e97971d-1fc6-4cc8-8cab-2d11ed56b422-kube-api-access-2xd8c\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:18:33.181244 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.181207 2575 generic.go:358] "Generic (PLEG): container finished" podID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerID="218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1" exitCode=0 Mar 18 17:18:33.181419 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.181281 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" event={"ID":"755f481a-ba52-4f66-96fa-b6ae8f2239df","Type":"ContainerDied","Data":"218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1"} Mar 18 17:18:33.182767 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.182736 2575 generic.go:358] "Generic (PLEG): container finished" podID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerID="013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168" exitCode=0 Mar 18 17:18:33.182861 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.182799 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" Mar 18 17:18:33.182861 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.182815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" event={"ID":"9e97971d-1fc6-4cc8-8cab-2d11ed56b422","Type":"ContainerDied","Data":"013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168"} Mar 18 17:18:33.182861 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.182855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w" event={"ID":"9e97971d-1fc6-4cc8-8cab-2d11ed56b422","Type":"ContainerDied","Data":"9cb222d3dffd8bb78328fa0deaf5086a914a93fb01a68e0290abb5830c224cba"} Mar 18 17:18:33.182964 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.182870 2575 scope.go:117] "RemoveContainer" containerID="013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168" Mar 18 17:18:33.192530 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.192514 2575 scope.go:117] "RemoveContainer" containerID="efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25" Mar 18 17:18:33.204407 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.204336 2575 scope.go:117] "RemoveContainer" containerID="013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168" Mar 18 17:18:33.205081 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:18:33.204794 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168\": container with ID starting with 013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168 not 
found: ID does not exist" containerID="013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168" Mar 18 17:18:33.205081 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.204831 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168"} err="failed to get container status \"013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168\": rpc error: code = NotFound desc = could not find container \"013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168\": container with ID starting with 013f72df7959a763fa1a48932d225226c80ac6a5e1c54b1931ef61127bbb8168 not found: ID does not exist" Mar 18 17:18:33.205081 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.204877 2575 scope.go:117] "RemoveContainer" containerID="efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25" Mar 18 17:18:33.205267 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:18:33.205244 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25\": container with ID starting with efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25 not found: ID does not exist" containerID="efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25" Mar 18 17:18:33.205328 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.205272 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25"} err="failed to get container status \"efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25\": rpc error: code = NotFound desc = could not find container \"efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25\": container with ID starting with efe22a39036deea9fa3f6057faed5c5bace8e2bb273ef13d17ed5e796757dd25 not found: ID does not exist" Mar 18 17:18:33.206953 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.206929 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w"] Mar 18 17:18:33.216597 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:33.216572 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-57fb54b9b8-dqp9w"] Mar 18 17:18:33.286883 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:18:33.286855 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:18:34.190074 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:34.190042 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" 
event={"ID":"755f481a-ba52-4f66-96fa-b6ae8f2239df","Type":"ContainerStarted","Data":"b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b"} Mar 18 17:18:34.190504 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:34.190329 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:18:34.191753 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:34.191727 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Mar 18 17:18:34.209764 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:34.209722 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" podStartSLOduration=6.209706382 podStartE2EDuration="6.209706382s" podCreationTimestamp="2026-03-18 17:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:18:34.2079141 +0000 UTC m=+2032.519167021" watchObservedRunningTime="2026-03-18 17:18:34.209706382 +0000 UTC m=+2032.520959305" Mar 18 17:18:34.290407 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:34.290351 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" path="/var/lib/kubelet/pods/9e97971d-1fc6-4cc8-8cab-2d11ed56b422/volumes" Mar 18 17:18:35.193530 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:35.193479 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Mar 18 17:18:45.193500 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:45.193459 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Mar 18 17:18:45.287643 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:18:45.287610 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:18:55.194451 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:18:55.194367 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Mar 18 17:19:00.286478 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:19:00.286446 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:19:05.193622 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:05.193578 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Mar 18 17:19:12.289296 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:19:12.289262 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:19:15.194248 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:15.194206 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Mar 18 17:19:25.193988 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:25.193940 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Mar 18 17:19:25.286751 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:19:25.286718 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest 
fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:19:35.194081 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:35.194038 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Mar 18 17:19:39.286505 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:19:39.286476 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:19:42.319424 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:42.319394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:19:42.324103 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:42.324083 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:19:45.194144 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:45.194112 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:19:48.364864 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.364820 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6"] Mar 18 17:19:48.365346 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.365115 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" containerID="cri-o://b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b" gracePeriod=30 Mar 18 17:19:48.467927 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.467890 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"] Mar 18 17:19:48.468343 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.468323 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" Mar 18 17:19:48.468343 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.468345 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container" Mar 18 17:19:48.468549 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.468377 2575 
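The pod_startup_latency_tracker entries earlier in this excerpt record startup SLO durations directly: 5.93 s for the xgboost predictor and 6.21 s for lightgbm, both with firstStartedPulling at the zero time, i.e. no image pull was needed on this node. A last sketch, under the same kubelet.log assumption, that extracts those figures:

    import re

    SLO = re.compile(r'pod="([^"]+)" podStartSLOduration=([\d.]+)')

    def startup_slos(path="kubelet.log"):  # hypothetical saved copy
        for line in open(path, encoding="utf-8"):
            if m := SLO.search(line):
                yield m.group(1), float(m.group(2))

    for pod, secs in startup_slos():
        print(f"{pod}: observed running {secs:.2f}s after creation")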
Mar 18 17:19:48.468549 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.468386 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="storage-initializer"
Mar 18 17:19:48.468549 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.468479 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e97971d-1fc6-4cc8-8cab-2d11ed56b422" containerName="kserve-container"
Mar 18 17:19:48.471769 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.471749 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"
Mar 18 17:19:48.477650 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.477626 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"]
Mar 18 17:19:48.495005 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.494977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgtqg\" (UniqueName: \"kubernetes.io/projected/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kube-api-access-xgtqg\") pod \"isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9\" (UID: \"ec3eed47-00e3-44d0-b67f-1146f5f53e43\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"
Mar 18 17:19:48.495114 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.495069 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9\" (UID: \"ec3eed47-00e3-44d0-b67f-1146f5f53e43\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"
Mar 18 17:19:48.595526 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.595482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9\" (UID: \"ec3eed47-00e3-44d0-b67f-1146f5f53e43\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"
Mar 18 17:19:48.595681 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.595546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgtqg\" (UniqueName: \"kubernetes.io/projected/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kube-api-access-xgtqg\") pod \"isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9\" (UID: \"ec3eed47-00e3-44d0-b67f-1146f5f53e43\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"
Mar 18 17:19:48.595850 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.595820 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9\" (UID: \"ec3eed47-00e3-44d0-b67f-1146f5f53e43\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"
Mar 18 17:19:48.603310 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.603276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgtqg\" (UniqueName: \"kubernetes.io/projected/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kube-api-access-xgtqg\") pod \"isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9\" (UID: \"ec3eed47-00e3-44d0-b67f-1146f5f53e43\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"
Mar 18 17:19:48.783206 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.783178 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"
Mar 18 17:19:48.905677 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:48.905651 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"]
Mar 18 17:19:48.907649 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:19:48.907621 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec3eed47_00e3_44d0_b67f_1146f5f53e43.slice/crio-15411f1bd7926a5f16d63c3f9d5918ef61a092594038626766d1194dfbac7731 WatchSource:0}: Error finding container 15411f1bd7926a5f16d63c3f9d5918ef61a092594038626766d1194dfbac7731: Status 404 returned error can't find the container with id 15411f1bd7926a5f16d63c3f9d5918ef61a092594038626766d1194dfbac7731
Mar 18 17:19:49.431123 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:49.431086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" event={"ID":"ec3eed47-00e3-44d0-b67f-1146f5f53e43","Type":"ContainerStarted","Data":"15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b"}
Mar 18 17:19:49.431123 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:49.431128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" event={"ID":"ec3eed47-00e3-44d0-b67f-1146f5f53e43","Type":"ContainerStarted","Data":"15411f1bd7926a5f16d63c3f9d5918ef61a092594038626766d1194dfbac7731"}
Mar 18 17:19:51.287316 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:19:51.287286 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:19:53.444671 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:53.444638 2575 generic.go:358] "Generic (PLEG): container finished" podID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerID="15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b" exitCode=0
Mar 18 17:19:53.445043 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:53.444710 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" event={"ID":"ec3eed47-00e3-44d0-b67f-1146f5f53e43","Type":"ContainerDied","Data":"15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b"}
Mar 18 17:19:54.003861 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.003834 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6"
Mar 18 17:19:54.043558 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.043531 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpzvf\" (UniqueName: \"kubernetes.io/projected/755f481a-ba52-4f66-96fa-b6ae8f2239df-kube-api-access-zpzvf\") pod \"755f481a-ba52-4f66-96fa-b6ae8f2239df\" (UID: \"755f481a-ba52-4f66-96fa-b6ae8f2239df\") "
Mar 18 17:19:54.043704 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.043610 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/755f481a-ba52-4f66-96fa-b6ae8f2239df-kserve-provision-location\") pod \"755f481a-ba52-4f66-96fa-b6ae8f2239df\" (UID: \"755f481a-ba52-4f66-96fa-b6ae8f2239df\") "
Mar 18 17:19:54.043948 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.043923 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755f481a-ba52-4f66-96fa-b6ae8f2239df-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "755f481a-ba52-4f66-96fa-b6ae8f2239df" (UID: "755f481a-ba52-4f66-96fa-b6ae8f2239df"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 17:19:54.045686 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.045667 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755f481a-ba52-4f66-96fa-b6ae8f2239df-kube-api-access-zpzvf" (OuterVolumeSpecName: "kube-api-access-zpzvf") pod "755f481a-ba52-4f66-96fa-b6ae8f2239df" (UID: "755f481a-ba52-4f66-96fa-b6ae8f2239df"). InnerVolumeSpecName "kube-api-access-zpzvf".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:19:54.144935 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.144898 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zpzvf\" (UniqueName: \"kubernetes.io/projected/755f481a-ba52-4f66-96fa-b6ae8f2239df-kube-api-access-zpzvf\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:19:54.144935 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.144938 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/755f481a-ba52-4f66-96fa-b6ae8f2239df-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:19:54.449417 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.449318 2575 generic.go:358] "Generic (PLEG): container finished" podID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerID="b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b" exitCode=0 Mar 18 17:19:54.449417 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.449397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" event={"ID":"755f481a-ba52-4f66-96fa-b6ae8f2239df","Type":"ContainerDied","Data":"b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b"} Mar 18 17:19:54.449883 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.449434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" event={"ID":"755f481a-ba52-4f66-96fa-b6ae8f2239df","Type":"ContainerDied","Data":"a7578722ade4e3b10a8084d817c0d62827fd83a72b599fd0594f9a01916c12c2"} Mar 18 17:19:54.449883 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.449404 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6" Mar 18 17:19:54.449883 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.449451 2575 scope.go:117] "RemoveContainer" containerID="b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b" Mar 18 17:19:54.451323 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.451290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" event={"ID":"ec3eed47-00e3-44d0-b67f-1146f5f53e43","Type":"ContainerStarted","Data":"66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa"} Mar 18 17:19:54.451614 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.451587 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" Mar 18 17:19:54.458086 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.458071 2575 scope.go:117] "RemoveContainer" containerID="218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1" Mar 18 17:19:54.467523 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.467496 2575 scope.go:117] "RemoveContainer" containerID="b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b" Mar 18 17:19:54.467802 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:19:54.467783 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b\": container with ID starting with b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b not found: ID does not exist" containerID="b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b" Mar 18 17:19:54.467848 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.467810 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b"} err="failed to get container status \"b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b\": rpc error: code = NotFound desc = could not find container \"b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b\": container with ID starting with b6272cf9085dda664703771766bd1f007eb64a35fb1b3291a659121825e0745b not found: ID does not exist" Mar 18 17:19:54.467848 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.467828 2575 scope.go:117] "RemoveContainer" containerID="218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1" Mar 18 17:19:54.468091 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:19:54.468072 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1\": container with ID starting with 218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1 not found: ID does not exist" containerID="218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1" Mar 18 17:19:54.468168 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.468096 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1"} err="failed to get container status \"218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1\": rpc error: code = NotFound desc = could not find container \"218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1\": 
container with ID starting with 218c71db3c002e7cfb674ea7fa8065a197fa9770a13cd65e5dbd3f56dbf8d1d1 not found: ID does not exist" Mar 18 17:19:54.470831 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.470797 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" podStartSLOduration=6.470786635 podStartE2EDuration="6.470786635s" podCreationTimestamp="2026-03-18 17:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:19:54.469487623 +0000 UTC m=+2112.780740544" watchObservedRunningTime="2026-03-18 17:19:54.470786635 +0000 UTC m=+2112.782039556" Mar 18 17:19:54.483887 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.483867 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6"] Mar 18 17:19:54.489513 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:54.489493 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-7f75c98c97-7psv6"] Mar 18 17:19:56.290112 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:19:56.290077 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" path="/var/lib/kubelet/pods/755f481a-ba52-4f66-96fa-b6ae8f2239df/volumes" Mar 18 17:20:02.290024 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:20:02.289995 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:20:16.289643 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:20:16.287704 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:20:25.457542 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:20:25.457442 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.54:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 
10.134.0.54:8080: connect: connection refused" Mar 18 17:20:29.287526 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:20:29.287487 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:20:35.456960 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:20:35.456914 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.54:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.54:8080: connect: connection refused" Mar 18 17:20:41.286970 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:20:41.286949 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:20:41.287288 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:20:41.287222 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:20:45.457314 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:20:45.457270 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.54:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.54:8080: connect: connection refused" Mar 18 17:20:54.290941 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:20:54.290914 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the 
requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:20:55.456507 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:20:55.456462 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.54:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.54:8080: connect: connection refused" Mar 18 17:21:05.287426 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:21:05.287389 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:21:05.460417 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:05.460386 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" Mar 18 17:21:08.676575 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.676546 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"] Mar 18 17:21:08.676959 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.676774 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerName="kserve-container" containerID="cri-o://66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa" gracePeriod=30 Mar 18 17:21:08.760203 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.760167 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn"] Mar 18 17:21:08.761075 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.761050 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="storage-initializer" Mar 18 17:21:08.761227 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.761216 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="storage-initializer" Mar 18 17:21:08.761351 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.761341 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" Mar 18 17:21:08.761452 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.761442 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" Mar 18 17:21:08.761685 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.761670 2575 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="755f481a-ba52-4f66-96fa-b6ae8f2239df" containerName="kserve-container" Mar 18 17:21:08.765574 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.765553 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:21:08.770964 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.770942 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn"] Mar 18 17:21:08.839330 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.839304 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrjg4\" (UniqueName: \"kubernetes.io/projected/1024df20-af7e-4007-9897-75100735b096-kube-api-access-lrjg4\") pod \"isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn\" (UID: \"1024df20-af7e-4007-9897-75100735b096\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:21:08.839501 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.839382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1024df20-af7e-4007-9897-75100735b096-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn\" (UID: \"1024df20-af7e-4007-9897-75100735b096\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:21:08.940282 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.940203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrjg4\" (UniqueName: \"kubernetes.io/projected/1024df20-af7e-4007-9897-75100735b096-kube-api-access-lrjg4\") pod \"isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn\" (UID: \"1024df20-af7e-4007-9897-75100735b096\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:21:08.940282 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.940265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1024df20-af7e-4007-9897-75100735b096-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn\" (UID: \"1024df20-af7e-4007-9897-75100735b096\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:21:08.940626 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.940606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1024df20-af7e-4007-9897-75100735b096-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn\" (UID: \"1024df20-af7e-4007-9897-75100735b096\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:21:08.948038 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:08.948013 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrjg4\" (UniqueName: \"kubernetes.io/projected/1024df20-af7e-4007-9897-75100735b096-kube-api-access-lrjg4\") pod \"isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn\" (UID: \"1024df20-af7e-4007-9897-75100735b096\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:21:09.077861 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:09.077826 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:21:09.203136 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:09.203109 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn"] Mar 18 17:21:09.205639 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:21:09.205606 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1024df20_af7e_4007_9897_75100735b096.slice/crio-41e302b2b554396e71f39f90c836cb2fa36fcabd02abee4e7f8b5cab9e0777bd WatchSource:0}: Error finding container 41e302b2b554396e71f39f90c836cb2fa36fcabd02abee4e7f8b5cab9e0777bd: Status 404 returned error can't find the container with id 41e302b2b554396e71f39f90c836cb2fa36fcabd02abee4e7f8b5cab9e0777bd Mar 18 17:21:09.705415 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:09.705381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" event={"ID":"1024df20-af7e-4007-9897-75100735b096","Type":"ContainerStarted","Data":"e2b8c902398f27744b90b37bce326a4369f7dabac543d9f28e7ec690de6d187d"} Mar 18 17:21:09.705415 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:09.705419 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" event={"ID":"1024df20-af7e-4007-9897-75100735b096","Type":"ContainerStarted","Data":"41e302b2b554396e71f39f90c836cb2fa36fcabd02abee4e7f8b5cab9e0777bd"} Mar 18 17:21:13.132846 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.132824 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" Mar 18 17:21:13.173304 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.173281 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgtqg\" (UniqueName: \"kubernetes.io/projected/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kube-api-access-xgtqg\") pod \"ec3eed47-00e3-44d0-b67f-1146f5f53e43\" (UID: \"ec3eed47-00e3-44d0-b67f-1146f5f53e43\") " Mar 18 17:21:13.173452 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.173317 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kserve-provision-location\") pod \"ec3eed47-00e3-44d0-b67f-1146f5f53e43\" (UID: \"ec3eed47-00e3-44d0-b67f-1146f5f53e43\") " Mar 18 17:21:13.173624 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.173603 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ec3eed47-00e3-44d0-b67f-1146f5f53e43" (UID: "ec3eed47-00e3-44d0-b67f-1146f5f53e43"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:21:13.175457 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.175432 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kube-api-access-xgtqg" (OuterVolumeSpecName: "kube-api-access-xgtqg") pod "ec3eed47-00e3-44d0-b67f-1146f5f53e43" (UID: "ec3eed47-00e3-44d0-b67f-1146f5f53e43"). 
InnerVolumeSpecName "kube-api-access-xgtqg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:21:13.274194 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.274124 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xgtqg\" (UniqueName: \"kubernetes.io/projected/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kube-api-access-xgtqg\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:21:13.274194 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.274148 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec3eed47-00e3-44d0-b67f-1146f5f53e43-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:21:13.718324 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.718291 2575 generic.go:358] "Generic (PLEG): container finished" podID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerID="66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa" exitCode=0 Mar 18 17:21:13.718537 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.718372 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" Mar 18 17:21:13.718537 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.718388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" event={"ID":"ec3eed47-00e3-44d0-b67f-1146f5f53e43","Type":"ContainerDied","Data":"66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa"} Mar 18 17:21:13.718537 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.718422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9" event={"ID":"ec3eed47-00e3-44d0-b67f-1146f5f53e43","Type":"ContainerDied","Data":"15411f1bd7926a5f16d63c3f9d5918ef61a092594038626766d1194dfbac7731"} Mar 18 17:21:13.718537 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.718446 2575 scope.go:117] "RemoveContainer" containerID="66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa" Mar 18 17:21:13.719788 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.719764 2575 generic.go:358] "Generic (PLEG): container finished" podID="1024df20-af7e-4007-9897-75100735b096" containerID="e2b8c902398f27744b90b37bce326a4369f7dabac543d9f28e7ec690de6d187d" exitCode=0 Mar 18 17:21:13.719913 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.719804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" event={"ID":"1024df20-af7e-4007-9897-75100735b096","Type":"ContainerDied","Data":"e2b8c902398f27744b90b37bce326a4369f7dabac543d9f28e7ec690de6d187d"} Mar 18 17:21:13.727593 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.727578 2575 scope.go:117] "RemoveContainer" containerID="15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b" Mar 18 17:21:13.737445 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.737190 2575 scope.go:117] "RemoveContainer" containerID="66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa" Mar 18 17:21:13.737811 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:21:13.737774 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa\": container with ID starting with 
66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa not found: ID does not exist" containerID="66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa" Mar 18 17:21:13.737890 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.737811 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa"} err="failed to get container status \"66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa\": rpc error: code = NotFound desc = could not find container \"66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa\": container with ID starting with 66e36a0f43c2a72f0b4796cdd75210a85c2a95f57e00e31cb07028c18aacd9fa not found: ID does not exist" Mar 18 17:21:13.737890 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.737837 2575 scope.go:117] "RemoveContainer" containerID="15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b" Mar 18 17:21:13.738307 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:21:13.738283 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b\": container with ID starting with 15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b not found: ID does not exist" containerID="15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b" Mar 18 17:21:13.738441 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.738315 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b"} err="failed to get container status \"15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b\": rpc error: code = NotFound desc = could not find container \"15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b\": container with ID starting with 15b5d48806dff23739de60f95d2bddaf50d25d1ee4f8ff41460e84d38b78476b not found: ID does not exist" Mar 18 17:21:13.748579 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.748557 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"] Mar 18 17:21:13.751484 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:13.751463 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-597b8bfb5-p6qz9"] Mar 18 17:21:14.291072 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:14.291042 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" path="/var/lib/kubelet/pods/ec3eed47-00e3-44d0-b67f-1146f5f53e43/volumes" Mar 18 17:21:14.725337 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:14.725306 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" event={"ID":"1024df20-af7e-4007-9897-75100735b096","Type":"ContainerStarted","Data":"7de0764394bc665867b20eac5114e436adc0b1eeb9fe272eb156e13a0bc136b7"} Mar 18 17:21:14.725581 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:14.725565 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:21:14.742075 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:14.742037 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" podStartSLOduration=6.742027159 podStartE2EDuration="6.742027159s" podCreationTimestamp="2026-03-18 17:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:21:14.740713286 +0000 UTC m=+2193.051966217" watchObservedRunningTime="2026-03-18 17:21:14.742027159 +0000 UTC m=+2193.053280080" Mar 18 17:21:17.286526 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:21:17.286488 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:21:30.286618 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:21:30.286584 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:21:42.289154 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:21:42.289097 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:21:45.730864 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:45.730822 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" podUID="1024df20-af7e-4007-9897-75100735b096" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.55:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.55:8080: connect: connection refused" Mar 18 17:21:54.287159 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:21:54.287117 2575 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:21:55.729716 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:21:55.729675 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" podUID="1024df20-af7e-4007-9897-75100735b096" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.55:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.55:8080: connect: connection refused" Mar 18 17:22:05.729144 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:05.729100 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" podUID="1024df20-af7e-4007-9897-75100735b096" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.55:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.55:8080: connect: connection refused" Mar 18 17:22:08.287419 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:22:08.287386 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:22:15.729350 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:15.729310 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" podUID="1024df20-af7e-4007-9897-75100735b096" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.55:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.55:8080: connect: connection refused" Mar 18 17:22:18.286292 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:18.286252 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" podUID="1024df20-af7e-4007-9897-75100735b096" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.55:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.55:8080: connect: connection refused" Mar 18 17:22:19.286759 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:22:19.286720 2575 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:22:28.290417 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.290388 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:22:28.674087 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.674049 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn"] Mar 18 17:22:28.944080 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.944012 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq"] Mar 18 17:22:28.944327 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.944315 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerName="storage-initializer" Mar 18 17:22:28.944387 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.944330 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerName="storage-initializer" Mar 18 17:22:28.944387 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.944338 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerName="kserve-container" Mar 18 17:22:28.944387 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.944344 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerName="kserve-container" Mar 18 17:22:28.944487 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.944432 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec3eed47-00e3-44d0-b67f-1146f5f53e43" containerName="kserve-container" Mar 18 17:22:28.947373 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.947346 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:22:28.954507 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.954071 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq"] Mar 18 17:22:28.967164 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:28.967131 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" podUID="1024df20-af7e-4007-9897-75100735b096" containerName="kserve-container" containerID="cri-o://7de0764394bc665867b20eac5114e436adc0b1eeb9fe272eb156e13a0bc136b7" gracePeriod=30 Mar 18 17:22:29.040148 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:29.040118 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x68lj\" (UniqueName: \"kubernetes.io/projected/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kube-api-access-x68lj\") pod \"isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq\" (UID: \"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:22:29.040310 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:29.040199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq\" (UID: \"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:22:29.141400 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:29.141352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x68lj\" (UniqueName: \"kubernetes.io/projected/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kube-api-access-x68lj\") pod \"isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq\" (UID: \"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:22:29.141500 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:29.141444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq\" (UID: \"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:22:29.141806 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:29.141790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq\" (UID: \"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:22:29.148921 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:29.148898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x68lj\" (UniqueName: \"kubernetes.io/projected/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kube-api-access-x68lj\") pod \"isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq\" (UID: 
\"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:22:29.258047 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:29.257992 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:22:29.378574 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:29.378552 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq"] Mar 18 17:22:29.381969 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:22:29.381933 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a309cf5_fdfd_4e0e_a598_a4b6e08c7558.slice/crio-9f9ef74f0f45676add81af335fb31b18cd89cdcf96ab15713046908764a6c46a WatchSource:0}: Error finding container 9f9ef74f0f45676add81af335fb31b18cd89cdcf96ab15713046908764a6c46a: Status 404 returned error can't find the container with id 9f9ef74f0f45676add81af335fb31b18cd89cdcf96ab15713046908764a6c46a Mar 18 17:22:29.972691 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:29.972659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" event={"ID":"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558","Type":"ContainerStarted","Data":"6ca3747a106a0b06887aa34325d556de77b016f60b8552c7953b8d7dc8fb6a04"} Mar 18 17:22:29.972691 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:29.972698 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" event={"ID":"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558","Type":"ContainerStarted","Data":"9f9ef74f0f45676add81af335fb31b18cd89cdcf96ab15713046908764a6c46a"} Mar 18 17:22:32.288618 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:22:32.288442 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:22:32.983648 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:32.983616 2575 generic.go:358] "Generic (PLEG): container finished" podID="1024df20-af7e-4007-9897-75100735b096" containerID="7de0764394bc665867b20eac5114e436adc0b1eeb9fe272eb156e13a0bc136b7" exitCode=0 Mar 18 17:22:32.983794 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:32.983687 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" event={"ID":"1024df20-af7e-4007-9897-75100735b096","Type":"ContainerDied","Data":"7de0764394bc665867b20eac5114e436adc0b1eeb9fe272eb156e13a0bc136b7"} Mar 18 17:22:33.008848 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.008830 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:22:33.072498 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.072430 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1024df20-af7e-4007-9897-75100735b096-kserve-provision-location\") pod \"1024df20-af7e-4007-9897-75100735b096\" (UID: \"1024df20-af7e-4007-9897-75100735b096\") " Mar 18 17:22:33.072498 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.072485 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrjg4\" (UniqueName: \"kubernetes.io/projected/1024df20-af7e-4007-9897-75100735b096-kube-api-access-lrjg4\") pod \"1024df20-af7e-4007-9897-75100735b096\" (UID: \"1024df20-af7e-4007-9897-75100735b096\") " Mar 18 17:22:33.072729 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.072708 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1024df20-af7e-4007-9897-75100735b096-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1024df20-af7e-4007-9897-75100735b096" (UID: "1024df20-af7e-4007-9897-75100735b096"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:22:33.074578 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.074552 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1024df20-af7e-4007-9897-75100735b096-kube-api-access-lrjg4" (OuterVolumeSpecName: "kube-api-access-lrjg4") pod "1024df20-af7e-4007-9897-75100735b096" (UID: "1024df20-af7e-4007-9897-75100735b096"). InnerVolumeSpecName "kube-api-access-lrjg4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:22:33.173218 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.173184 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1024df20-af7e-4007-9897-75100735b096-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:22:33.173218 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.173219 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrjg4\" (UniqueName: \"kubernetes.io/projected/1024df20-af7e-4007-9897-75100735b096-kube-api-access-lrjg4\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:22:33.987977 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.987945 2575 generic.go:358] "Generic (PLEG): container finished" podID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerID="6ca3747a106a0b06887aa34325d556de77b016f60b8552c7953b8d7dc8fb6a04" exitCode=0 Mar 18 17:22:33.988448 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.988025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" event={"ID":"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558","Type":"ContainerDied","Data":"6ca3747a106a0b06887aa34325d556de77b016f60b8552c7953b8d7dc8fb6a04"} Mar 18 17:22:33.989618 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.989583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" event={"ID":"1024df20-af7e-4007-9897-75100735b096","Type":"ContainerDied","Data":"41e302b2b554396e71f39f90c836cb2fa36fcabd02abee4e7f8b5cab9e0777bd"} Mar 18 17:22:33.989712 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.989622 2575 scope.go:117] "RemoveContainer" containerID="7de0764394bc665867b20eac5114e436adc0b1eeb9fe272eb156e13a0bc136b7" Mar 18 17:22:33.989712 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.989621 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn" Mar 18 17:22:33.998500 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:33.998485 2575 scope.go:117] "RemoveContainer" containerID="e2b8c902398f27744b90b37bce326a4369f7dabac543d9f28e7ec690de6d187d" Mar 18 17:22:34.014434 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:34.014414 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn"] Mar 18 17:22:34.016346 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:34.016324 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-c57b4b4b8-bbvsn"] Mar 18 17:22:34.292200 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:34.292121 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1024df20-af7e-4007-9897-75100735b096" path="/var/lib/kubelet/pods/1024df20-af7e-4007-9897-75100735b096/volumes" Mar 18 17:22:34.994079 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:34.994046 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" event={"ID":"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558","Type":"ContainerStarted","Data":"c99da8666bad93ad5c9f2b7d44bc24e489de5ffd37e3801d8f068ad92c72c01f"} Mar 18 17:22:34.994517 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:34.994289 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:22:35.009204 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:22:35.009161 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" podStartSLOduration=7.009145719 podStartE2EDuration="7.009145719s" podCreationTimestamp="2026-03-18 17:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:22:35.008083638 +0000 UTC m=+2273.319336559" watchObservedRunningTime="2026-03-18 17:22:35.009145719 +0000 UTC m=+2273.320398641" Mar 18 17:22:45.287259 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:22:45.287228 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:23:00.287430 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:23:00.287315 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:23:06.000320 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:06.000281 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.56:8080: connect: connection refused" Mar 18 17:23:13.548704 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:23:13.548632 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:23:13.549009 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:23:13.548797 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 17:23:13.549987 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:23:13.549960 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:23:16.000295 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:16.000259 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.56:8080: connect: connection refused"
Mar 18 17:23:25.999713 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:25.999618 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.56:8080: connect: connection refused"
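
Note: the 17:23:13 entries above are the actual registry round-trip behind every ImagePullBackOff line in this capture. CRI-O tries the reference both as a container image and as an OCI artifact, and quay.io answers "unauthorized" to both while resolving the manifest for the "fast" tag, so the kubelet can only keep backing off. A quick way to tell a missing or wrong pull secret apart from a tag that does not exist is to repeat the manifest fetch outside the kubelet. A minimal Go sketch using go-containerregistry (chosen here purely for illustration; the node itself pulls through CRI-O) with placeholder credentials:

    package main

    import (
        "fmt"

        "github.com/google/go-containerregistry/pkg/authn"
        "github.com/google/go-containerregistry/pkg/crane"
    )

    func main() {
        const image = "quay.io/opendatahub/odh-model-serving-api:fast"

        // Anonymous fetch: expected to fail with the same "unauthorized"
        // seen in the kubelet log if the repository is private (quay also
        // hides missing tags behind an auth error for private repos).
        if _, err := crane.Manifest(image); err != nil {
            fmt.Println("anonymous:", err)
        }

        // Retry with explicit credentials; these are placeholders, use the
        // robot account the pod's pull secret is supposed to carry.
        auth := crane.WithAuth(&authn.Basic{Username: "org+robot", Password: "REDACTED"})
        if m, err := crane.Manifest(image, auth); err != nil {
            fmt.Println("authenticated:", err)
        } else {
            fmt.Printf("manifest resolved, %d bytes\n", len(m))
        }
    }

If the authenticated fetch succeeds, the problem is the imagePullSecrets wiring on the kserve pod; if it still fails for an account that can see the repository, the "fast" tag itself is gone or access was revoked.
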
Mar 18 17:23:27.286914 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:23:27.286882 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:23:35.999492 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:35.999448 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.56:8080: connect: connection refused"
Mar 18 17:23:40.287156 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:23:40.287116 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:23:46.003121 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:46.003089 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq"
Mar 18 17:23:48.910472 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:48.910440 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq"]
Mar 18 17:23:48.910846 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:48.910693 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerName="kserve-container" containerID="cri-o://c99da8666bad93ad5c9f2b7d44bc24e489de5ffd37e3801d8f068ad92c72c01f" gracePeriod=30
Mar 18 17:23:51.163659 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.163626 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d"]
Mar 18 17:23:51.164013 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.163967 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1024df20-af7e-4007-9897-75100735b096" containerName="kserve-container"
Mar 18 17:23:51.164013 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.163978 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1024df20-af7e-4007-9897-75100735b096" containerName="kserve-container"
Mar 18 17:23:51.164013 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.163992 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1024df20-af7e-4007-9897-75100735b096" containerName="storage-initializer"
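
Note: a decoding aid for the long "Unhandled Error" dump at 17:23:13 above. It prints the pod's Container spec in Go struct syntax, so resource values appear in apimachinery's internal resource.Quantity form: {{200 -3} {} 200m DecimalSI} is unscaled value 200 at decimal scale -3, i.e. 200m CPU, and {{134217728 0} {} BinarySI} is 134217728 bytes, i.e. 128Mi. A short sketch against k8s.io/apimachinery (an external cross-check, not something run on the node) that reproduces the numbers:

    package main

    import (
        "fmt"

        "k8s.io/apimachinery/pkg/api/resource"
    )

    func main() {
        // The limits printed in the dump: cpu {{200 -3} {} 200m DecimalSI},
        // memory {{134217728 0} {} BinarySI}.
        cpu := resource.MustParse("200m")  // stored as unscaled 200, scale -3
        mem := resource.MustParse("128Mi") // stored as unscaled 134217728, scale 0
        fmt.Println(cpu.MilliValue(), "millicores") // 200
        fmt.Println(mem.Value(), "bytes")           // 134217728
    }

Read that way, the server container requests 50m CPU and 64Mi ({{50 -3}} and {{67108864 0}} in the dump) with limits of 200m and 128Mi; the dump is noisy but describes an ordinary small deployment.
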
podUID="1024df20-af7e-4007-9897-75100735b096" containerName="storage-initializer" Mar 18 17:23:51.164148 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.164058 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1024df20-af7e-4007-9897-75100735b096" containerName="kserve-container" Mar 18 17:23:51.166977 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.166957 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:23:51.174557 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.174534 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d"] Mar 18 17:23:51.333754 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.333727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtbgh\" (UniqueName: \"kubernetes.io/projected/beb043c7-f436-46dd-b30f-6cf28fc9f944-kube-api-access-gtbgh\") pod \"isvc-sklearn-predictor-86c4cc6d4-k6q4d\" (UID: \"beb043c7-f436-46dd-b30f-6cf28fc9f944\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:23:51.333894 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.333773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beb043c7-f436-46dd-b30f-6cf28fc9f944-kserve-provision-location\") pod \"isvc-sklearn-predictor-86c4cc6d4-k6q4d\" (UID: \"beb043c7-f436-46dd-b30f-6cf28fc9f944\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:23:51.434741 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.434668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beb043c7-f436-46dd-b30f-6cf28fc9f944-kserve-provision-location\") pod \"isvc-sklearn-predictor-86c4cc6d4-k6q4d\" (UID: \"beb043c7-f436-46dd-b30f-6cf28fc9f944\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:23:51.434856 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.434736 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtbgh\" (UniqueName: \"kubernetes.io/projected/beb043c7-f436-46dd-b30f-6cf28fc9f944-kube-api-access-gtbgh\") pod \"isvc-sklearn-predictor-86c4cc6d4-k6q4d\" (UID: \"beb043c7-f436-46dd-b30f-6cf28fc9f944\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:23:51.435026 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.435008 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beb043c7-f436-46dd-b30f-6cf28fc9f944-kserve-provision-location\") pod \"isvc-sklearn-predictor-86c4cc6d4-k6q4d\" (UID: \"beb043c7-f436-46dd-b30f-6cf28fc9f944\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:23:51.441912 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.441884 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtbgh\" (UniqueName: \"kubernetes.io/projected/beb043c7-f436-46dd-b30f-6cf28fc9f944-kube-api-access-gtbgh\") pod \"isvc-sklearn-predictor-86c4cc6d4-k6q4d\" (UID: \"beb043c7-f436-46dd-b30f-6cf28fc9f944\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:23:51.479632 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.479603 
2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:23:51.597961 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:51.597885 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d"] Mar 18 17:23:51.600340 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:23:51.600316 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeb043c7_f436_46dd_b30f_6cf28fc9f944.slice/crio-50f280690197a7811e34be5ce95cf8ad0e6ede4005af70c7cd0622e90de780e3 WatchSource:0}: Error finding container 50f280690197a7811e34be5ce95cf8ad0e6ede4005af70c7cd0622e90de780e3: Status 404 returned error can't find the container with id 50f280690197a7811e34be5ce95cf8ad0e6ede4005af70c7cd0622e90de780e3 Mar 18 17:23:52.252929 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:52.252895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" event={"ID":"beb043c7-f436-46dd-b30f-6cf28fc9f944","Type":"ContainerStarted","Data":"6ff962c54c47c072fbf8196ba4e6f78959ef3042aca6681968e6963874809956"} Mar 18 17:23:52.252929 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:52.252931 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" event={"ID":"beb043c7-f436-46dd-b30f-6cf28fc9f944","Type":"ContainerStarted","Data":"50f280690197a7811e34be5ce95cf8ad0e6ede4005af70c7cd0622e90de780e3"} Mar 18 17:23:52.289637 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:23:52.289581 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:23:54.261501 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:54.261470 2575 generic.go:358] "Generic (PLEG): container finished" podID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerID="c99da8666bad93ad5c9f2b7d44bc24e489de5ffd37e3801d8f068ad92c72c01f" exitCode=0 Mar 18 17:23:54.261813 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:54.261553 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" event={"ID":"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558","Type":"ContainerDied","Data":"c99da8666bad93ad5c9f2b7d44bc24e489de5ffd37e3801d8f068ad92c72c01f"} Mar 18 17:23:54.644197 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:54.644171 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:23:54.757635 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:54.757613 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kserve-provision-location\") pod \"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558\" (UID: \"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558\") " Mar 18 17:23:54.757761 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:54.757661 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x68lj\" (UniqueName: \"kubernetes.io/projected/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kube-api-access-x68lj\") pod \"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558\" (UID: \"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558\") " Mar 18 17:23:54.757936 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:54.757913 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" (UID: "4a309cf5-fdfd-4e0e-a598-a4b6e08c7558"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:23:54.759807 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:54.759774 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kube-api-access-x68lj" (OuterVolumeSpecName: "kube-api-access-x68lj") pod "4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" (UID: "4a309cf5-fdfd-4e0e-a598-a4b6e08c7558"). InnerVolumeSpecName "kube-api-access-x68lj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:23:54.859101 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:54.859044 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:23:54.859101 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:54.859065 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x68lj\" (UniqueName: \"kubernetes.io/projected/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558-kube-api-access-x68lj\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:23:55.266115 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:55.266094 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" Mar 18 17:23:55.266503 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:55.266092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq" event={"ID":"4a309cf5-fdfd-4e0e-a598-a4b6e08c7558","Type":"ContainerDied","Data":"9f9ef74f0f45676add81af335fb31b18cd89cdcf96ab15713046908764a6c46a"} Mar 18 17:23:55.266503 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:55.266208 2575 scope.go:117] "RemoveContainer" containerID="c99da8666bad93ad5c9f2b7d44bc24e489de5ffd37e3801d8f068ad92c72c01f" Mar 18 17:23:55.277301 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:55.277279 2575 scope.go:117] "RemoveContainer" containerID="6ca3747a106a0b06887aa34325d556de77b016f60b8552c7953b8d7dc8fb6a04" Mar 18 17:23:55.288120 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:55.288101 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq"] Mar 18 17:23:55.291495 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:55.291475 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-744555fc5-gzbqq"] Mar 18 17:23:56.271985 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:56.271951 2575 generic.go:358] "Generic (PLEG): container finished" podID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerID="6ff962c54c47c072fbf8196ba4e6f78959ef3042aca6681968e6963874809956" exitCode=0 Mar 18 17:23:56.272321 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:56.271999 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" event={"ID":"beb043c7-f436-46dd-b30f-6cf28fc9f944","Type":"ContainerDied","Data":"6ff962c54c47c072fbf8196ba4e6f78959ef3042aca6681968e6963874809956"} Mar 18 17:23:56.291397 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:56.291347 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" path="/var/lib/kubelet/pods/4a309cf5-fdfd-4e0e-a598-a4b6e08c7558/volumes" Mar 18 17:23:57.277080 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:57.277043 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" event={"ID":"beb043c7-f436-46dd-b30f-6cf28fc9f944","Type":"ContainerStarted","Data":"65eeabda88d7c964f1d30d59586dd2bbdb33a91149a5f1451412fe8b01cf3d1c"} Mar 18 17:23:57.277564 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:57.277387 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:23:57.278670 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:57.278644 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Mar 18 17:23:57.293369 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:57.293313 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podStartSLOduration=6.293284148 podStartE2EDuration="6.293284148s" podCreationTimestamp="2026-03-18 17:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:23:57.291727746 +0000 UTC m=+2355.602980669" watchObservedRunningTime="2026-03-18 17:23:57.293284148 +0000 UTC m=+2355.604537069" Mar 18 17:23:58.281827 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:23:58.281792 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Mar 18 17:24:04.287543 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:24:04.287512 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:24:08.282499 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:24:08.282450 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Mar 18 17:24:18.282299 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:24:18.282254 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Mar 18 17:24:18.286754 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:24:18.286726 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:24:28.282530 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:24:28.282488 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Mar 18 17:24:30.290787 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:24:30.290757 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:24:38.282608 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:24:38.282570 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Mar 18 17:24:42.289391 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:24:42.289335 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:24:42.342272 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:24:42.342249 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:24:42.347546 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:24:42.347529 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:24:48.282266 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:24:48.282217 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Mar 18 17:24:54.286968 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:24:54.286929 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" 
podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:24:58.282294 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:24:58.282248 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Mar 18 17:25:05.285971 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:05.285929 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Mar 18 17:25:05.286916 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:25:05.286889 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:25:15.287049 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:15.287017 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:25:20.289383 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:25:20.287477 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:25:21.463694 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.463649 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d"] Mar 18 17:25:21.464091 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.463997 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" containerID="cri-o://65eeabda88d7c964f1d30d59586dd2bbdb33a91149a5f1451412fe8b01cf3d1c" gracePeriod=30 Mar 18 17:25:21.549071 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.549045 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx"] Mar 18 17:25:21.549390 ip-10-0-139-49 
kubenswrapper[2575]: I0318 17:25:21.549350 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerName="kserve-container" Mar 18 17:25:21.549390 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.549378 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerName="kserve-container" Mar 18 17:25:21.549513 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.549399 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerName="storage-initializer" Mar 18 17:25:21.549513 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.549404 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerName="storage-initializer" Mar 18 17:25:21.549513 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.549454 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a309cf5-fdfd-4e0e-a598-a4b6e08c7558" containerName="kserve-container" Mar 18 17:25:21.552286 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.552271 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" Mar 18 17:25:21.561812 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.561787 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx"] Mar 18 17:25:21.674610 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.674583 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx\" (UID: \"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" Mar 18 17:25:21.674732 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.674621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5xxx\" (UniqueName: \"kubernetes.io/projected/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kube-api-access-g5xxx\") pod \"sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx\" (UID: \"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" Mar 18 17:25:21.775202 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.775134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx\" (UID: \"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" Mar 18 17:25:21.775202 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.775180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5xxx\" (UniqueName: \"kubernetes.io/projected/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kube-api-access-g5xxx\") pod \"sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx\" (UID: \"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" Mar 18 17:25:21.775530 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.775510 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx\" (UID: \"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" Mar 18 17:25:21.782293 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.782269 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5xxx\" (UniqueName: \"kubernetes.io/projected/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kube-api-access-g5xxx\") pod \"sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx\" (UID: \"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" Mar 18 17:25:21.863217 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.863192 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" Mar 18 17:25:21.982209 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:21.982181 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx"] Mar 18 17:25:21.984458 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:25:21.984429 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd9e698_4fc3_49b1_bd53_aec4b50bc1f1.slice/crio-890f04cc84e7c2b5cbfc8a6c84507b7b9c7d6a71f68e02bfe77979ce27e5add2 WatchSource:0}: Error finding container 890f04cc84e7c2b5cbfc8a6c84507b7b9c7d6a71f68e02bfe77979ce27e5add2: Status 404 returned error can't find the container with id 890f04cc84e7c2b5cbfc8a6c84507b7b9c7d6a71f68e02bfe77979ce27e5add2 Mar 18 17:25:22.542656 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:22.542624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" event={"ID":"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1","Type":"ContainerStarted","Data":"45e96485d3d9fb4dee7553e70a5d1ded861918923f3221dd16fad62fa9474e7a"} Mar 18 17:25:22.542656 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:22.542656 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" event={"ID":"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1","Type":"ContainerStarted","Data":"890f04cc84e7c2b5cbfc8a6c84507b7b9c7d6a71f68e02bfe77979ce27e5add2"} Mar 18 17:25:25.286052 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:25.286013 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Mar 18 17:25:25.556173 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:25.556102 2575 generic.go:358] "Generic (PLEG): container finished" podID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerID="65eeabda88d7c964f1d30d59586dd2bbdb33a91149a5f1451412fe8b01cf3d1c" exitCode=0 Mar 18 17:25:25.556173 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:25.556155 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" event={"ID":"beb043c7-f436-46dd-b30f-6cf28fc9f944","Type":"ContainerDied","Data":"65eeabda88d7c964f1d30d59586dd2bbdb33a91149a5f1451412fe8b01cf3d1c"} Mar 18 
17:25:25.604787 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:25.604763 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" Mar 18 17:25:25.700728 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:25.700678 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtbgh\" (UniqueName: \"kubernetes.io/projected/beb043c7-f436-46dd-b30f-6cf28fc9f944-kube-api-access-gtbgh\") pod \"beb043c7-f436-46dd-b30f-6cf28fc9f944\" (UID: \"beb043c7-f436-46dd-b30f-6cf28fc9f944\") " Mar 18 17:25:25.700728 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:25.700734 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beb043c7-f436-46dd-b30f-6cf28fc9f944-kserve-provision-location\") pod \"beb043c7-f436-46dd-b30f-6cf28fc9f944\" (UID: \"beb043c7-f436-46dd-b30f-6cf28fc9f944\") " Mar 18 17:25:25.701050 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:25.701024 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beb043c7-f436-46dd-b30f-6cf28fc9f944-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "beb043c7-f436-46dd-b30f-6cf28fc9f944" (UID: "beb043c7-f436-46dd-b30f-6cf28fc9f944"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:25:25.702985 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:25.702957 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb043c7-f436-46dd-b30f-6cf28fc9f944-kube-api-access-gtbgh" (OuterVolumeSpecName: "kube-api-access-gtbgh") pod "beb043c7-f436-46dd-b30f-6cf28fc9f944" (UID: "beb043c7-f436-46dd-b30f-6cf28fc9f944"). InnerVolumeSpecName "kube-api-access-gtbgh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:25:25.801648 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:25.801620 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gtbgh\" (UniqueName: \"kubernetes.io/projected/beb043c7-f436-46dd-b30f-6cf28fc9f944-kube-api-access-gtbgh\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:25:25.801648 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:25.801643 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beb043c7-f436-46dd-b30f-6cf28fc9f944-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:25:26.560467 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:26.560385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d" event={"ID":"beb043c7-f436-46dd-b30f-6cf28fc9f944","Type":"ContainerDied","Data":"50f280690197a7811e34be5ce95cf8ad0e6ede4005af70c7cd0622e90de780e3"} Mar 18 17:25:26.560467 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:26.560411 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d"
Mar 18 17:25:26.560467 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:26.560437 2575 scope.go:117] "RemoveContainer" containerID="65eeabda88d7c964f1d30d59586dd2bbdb33a91149a5f1451412fe8b01cf3d1c"
Mar 18 17:25:26.561894 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:26.561872 2575 generic.go:358] "Generic (PLEG): container finished" podID="ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" containerID="45e96485d3d9fb4dee7553e70a5d1ded861918923f3221dd16fad62fa9474e7a" exitCode=0
Mar 18 17:25:26.561984 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:26.561941 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" event={"ID":"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1","Type":"ContainerDied","Data":"45e96485d3d9fb4dee7553e70a5d1ded861918923f3221dd16fad62fa9474e7a"}
Mar 18 17:25:26.568965 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:26.568946 2575 scope.go:117] "RemoveContainer" containerID="6ff962c54c47c072fbf8196ba4e6f78959ef3042aca6681968e6963874809956"
Mar 18 17:25:26.588719 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:26.588695 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d"]
Mar 18 17:25:26.591140 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:26.591118 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-86c4cc6d4-k6q4d"]
Mar 18 17:25:27.567336 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:27.567302 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" event={"ID":"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1","Type":"ContainerStarted","Data":"96dbc386f7d1a7f654b5cdce1fbdf18b849a1042bfac3cab57e95c5fccfe0c3f"}
Mar 18 17:25:27.567820 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:27.567550 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx"
Mar 18 17:25:27.582781 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:27.582737 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" podStartSLOduration=6.582720428 podStartE2EDuration="6.582720428s" podCreationTimestamp="2026-03-18 17:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:25:27.582141159 +0000 UTC m=+2445.893394082" watchObservedRunningTime="2026-03-18 17:25:27.582720428 +0000 UTC m=+2445.893973359"
Mar 18 17:25:28.290470 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:28.290442 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" path="/var/lib/kubelet/pods/beb043c7-f436-46dd-b30f-6cf28fc9f944/volumes"
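
Note: the "Observed pod startup duration" entries in this capture (17:22:35, 17:23:57, and 17:25:27 just above) come from the kubelet's pod startup latency tracker, which feeds its pod-start SLI metrics. podStartSLOduration is the end-to-end startup time minus time spent pulling images; every firstStartedPulling/lastFinishedPulling here is the zero time (0001-01-01), i.e. nothing was pulled because the predictor images were already on the node, which is why the SLO and E2E figures coincide. A tiny sketch recomputing the 6.58s figure from the fields of the entry above (the logged value lines up with watchObservedRunningTime rather than observedRunningTime):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        // Errors ignored for brevity; both strings are copied from the log entry.
        created, _ := time.Parse(layout, "2026-03-18 17:25:21 +0000 UTC")           // podCreationTimestamp
        running, _ := time.Parse(layout, "2026-03-18 17:25:27.582720428 +0000 UTC") // watchObservedRunningTime
        var pull time.Duration // firstStartedPulling/lastFinishedPulling are zero: no pull time to subtract
        fmt.Println(running.Sub(created) - pull) // 6.582720428s, matching podStartSLOduration
    }
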
Mar 18 17:25:32.290936 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:25:32.290902 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:25:44.286930 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:44.286911 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 17:25:44.287188 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:25:44.287094 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:25:58.287667 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:25:58.287462 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:25:58.576726 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:25:58.576651 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx"
Mar 18 17:26:01.373436 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.373403 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx"]
Mar 18 17:26:01.373853 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.373659 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" podUID="ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" containerName="kserve-container" containerID="cri-o://96dbc386f7d1a7f654b5cdce1fbdf18b849a1042bfac3cab57e95c5fccfe0c3f" gracePeriod=30
Mar 18 17:26:01.461791 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.461755 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v"]
Mar 18 17:26:01.462727 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.462700 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="storage-initializer"
Mar 18 17:26:01.462893 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.462879 2575
state_mem.go:107] "Deleted CPUSet assignment" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="storage-initializer" Mar 18 17:26:01.462974 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.462964 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" Mar 18 17:26:01.463052 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.463044 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" Mar 18 17:26:01.463344 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.463331 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="beb043c7-f436-46dd-b30f-6cf28fc9f944" containerName="kserve-container" Mar 18 17:26:01.469267 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.469240 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:01.469722 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.469694 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v"] Mar 18 17:26:01.554231 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.554199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmrtx\" (UniqueName: \"kubernetes.io/projected/ed74f7ff-70ff-40cd-804b-b888a032192c-kube-api-access-cmrtx\") pod \"isvc-sklearn-runtime-predictor-798668567d-5z22v\" (UID: \"ed74f7ff-70ff-40cd-804b-b888a032192c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:01.554412 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.554258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed74f7ff-70ff-40cd-804b-b888a032192c-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-798668567d-5z22v\" (UID: \"ed74f7ff-70ff-40cd-804b-b888a032192c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:01.654793 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.654761 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed74f7ff-70ff-40cd-804b-b888a032192c-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-798668567d-5z22v\" (UID: \"ed74f7ff-70ff-40cd-804b-b888a032192c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:01.654938 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.654815 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmrtx\" (UniqueName: \"kubernetes.io/projected/ed74f7ff-70ff-40cd-804b-b888a032192c-kube-api-access-cmrtx\") pod \"isvc-sklearn-runtime-predictor-798668567d-5z22v\" (UID: \"ed74f7ff-70ff-40cd-804b-b888a032192c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:01.655123 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.655102 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed74f7ff-70ff-40cd-804b-b888a032192c-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-798668567d-5z22v\" (UID: 
\"ed74f7ff-70ff-40cd-804b-b888a032192c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:01.661618 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.661592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmrtx\" (UniqueName: \"kubernetes.io/projected/ed74f7ff-70ff-40cd-804b-b888a032192c-kube-api-access-cmrtx\") pod \"isvc-sklearn-runtime-predictor-798668567d-5z22v\" (UID: \"ed74f7ff-70ff-40cd-804b-b888a032192c\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:01.782054 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.782025 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:01.906501 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:01.906478 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v"] Mar 18 17:26:01.908855 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:26:01.908827 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded74f7ff_70ff_40cd_804b_b888a032192c.slice/crio-c499dedc33164741dcecff39f2bda4b2ce2efef4193f576a5e9d437a5de11c43 WatchSource:0}: Error finding container c499dedc33164741dcecff39f2bda4b2ce2efef4193f576a5e9d437a5de11c43: Status 404 returned error can't find the container with id c499dedc33164741dcecff39f2bda4b2ce2efef4193f576a5e9d437a5de11c43 Mar 18 17:26:02.680620 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:02.680586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" event={"ID":"ed74f7ff-70ff-40cd-804b-b888a032192c","Type":"ContainerStarted","Data":"16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed"} Mar 18 17:26:02.680620 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:02.680626 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" event={"ID":"ed74f7ff-70ff-40cd-804b-b888a032192c","Type":"ContainerStarted","Data":"c499dedc33164741dcecff39f2bda4b2ce2efef4193f576a5e9d437a5de11c43"} Mar 18 17:26:06.694883 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:06.694846 2575 generic.go:358] "Generic (PLEG): container finished" podID="ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" containerID="96dbc386f7d1a7f654b5cdce1fbdf18b849a1042bfac3cab57e95c5fccfe0c3f" exitCode=0 Mar 18 17:26:06.695408 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:06.694925 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" event={"ID":"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1","Type":"ContainerDied","Data":"96dbc386f7d1a7f654b5cdce1fbdf18b849a1042bfac3cab57e95c5fccfe0c3f"} Mar 18 17:26:06.825651 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:06.825632 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" Mar 18 17:26:06.894330 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:06.894310 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5xxx\" (UniqueName: \"kubernetes.io/projected/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kube-api-access-g5xxx\") pod \"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1\" (UID: \"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1\") " Mar 18 17:26:06.894473 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:06.894347 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kserve-provision-location\") pod \"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1\" (UID: \"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1\") " Mar 18 17:26:06.894696 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:06.894667 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" (UID: "ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:26:06.896529 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:06.896511 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kube-api-access-g5xxx" (OuterVolumeSpecName: "kube-api-access-g5xxx") pod "ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" (UID: "ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1"). InnerVolumeSpecName "kube-api-access-g5xxx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:26:06.995014 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:06.994986 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5xxx\" (UniqueName: \"kubernetes.io/projected/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kube-api-access-g5xxx\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:26:06.995014 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:06.995011 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:26:07.699993 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:07.699917 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" Mar 18 17:26:07.700433 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:07.699920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx" event={"ID":"ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1","Type":"ContainerDied","Data":"890f04cc84e7c2b5cbfc8a6c84507b7b9c7d6a71f68e02bfe77979ce27e5add2"} Mar 18 17:26:07.700433 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:07.700038 2575 scope.go:117] "RemoveContainer" containerID="96dbc386f7d1a7f654b5cdce1fbdf18b849a1042bfac3cab57e95c5fccfe0c3f" Mar 18 17:26:07.701463 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:07.701438 2575 generic.go:358] "Generic (PLEG): container finished" podID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerID="16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed" exitCode=0 Mar 18 17:26:07.701555 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:07.701479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" event={"ID":"ed74f7ff-70ff-40cd-804b-b888a032192c","Type":"ContainerDied","Data":"16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed"} Mar 18 17:26:07.708823 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:07.708801 2575 scope.go:117] "RemoveContainer" containerID="45e96485d3d9fb4dee7553e70a5d1ded861918923f3221dd16fad62fa9474e7a" Mar 18 17:26:07.731962 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:07.731940 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx"] Mar 18 17:26:07.735535 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:07.735516 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-67f9b5bcdf-brrhx"] Mar 18 17:26:08.290586 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:08.290547 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" path="/var/lib/kubelet/pods/ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1/volumes" Mar 18 17:26:08.706263 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:08.706226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" event={"ID":"ed74f7ff-70ff-40cd-804b-b888a032192c","Type":"ContainerStarted","Data":"c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420"} Mar 18 17:26:08.706731 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:08.706602 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:08.708102 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:08.708065 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Mar 18 17:26:08.723251 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:08.723210 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" podStartSLOduration=7.723197328 podStartE2EDuration="7.723197328s" podCreationTimestamp="2026-03-18 17:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:26:08.721771114 +0000 UTC m=+2487.033024036" watchObservedRunningTime="2026-03-18 17:26:08.723197328 +0000 UTC m=+2487.034450248" Mar 18 17:26:09.710735 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:09.710691 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Mar 18 17:26:10.287563 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:26:10.287532 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:26:19.710708 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:19.710666 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Mar 18 17:26:23.286639 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:26:23.286608 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:26:29.711772 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:29.711737 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:36.287432 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:26:36.287393 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in 
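
The "Back-off pulling image" errors for model-serving-api-9699c8d45-ttb7x recur at stretching intervals throughout the rest of this log because each failed pull lengthens the retry delay. A sketch of that pattern; the 10s starting delay and 5m cap are illustrative defaults, not values read from this cluster.

package main

import (
	"fmt"
	"time"
)

func main() {
	// Illustrative back-off: double the wait after each failed pull, capped.
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("pull attempt %d failed (unauthorized); next retry in %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
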
Mar 18 17:26:38.737669 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:38.737640 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-798668567d-5z22v_ed74f7ff-70ff-40cd-804b-b888a032192c/kserve-container/0.log"
Mar 18 17:26:38.880930 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:38.880899 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v"]
Mar 18 17:26:38.881215 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:38.881177 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerName="kserve-container" containerID="cri-o://c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420" gracePeriod=30
Mar 18 17:26:39.165698 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.165665 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x"]
Mar 18 17:26:39.166003 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.165992 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" containerName="kserve-container"
Mar 18 17:26:39.166056 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.166005 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" containerName="kserve-container"
Mar 18 17:26:39.166056 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.166022 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" containerName="storage-initializer"
Mar 18 17:26:39.166056 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.166027 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" containerName="storage-initializer"
Mar 18 17:26:39.166160 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.166089 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddd9e698-4fc3-49b1-bd53-aec4b50bc1f1" containerName="kserve-container"
Mar 18 17:26:39.168259 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.168243 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x"
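
"Killing container with a grace period ... gracePeriod=30" above is the usual two-phase stop: SIGTERM first, then SIGKILL only if the process outlives the grace period. A runnable sketch of the same pattern against an ordinary process; the shell command stands in for a slow-to-exit container and 3 seconds stands in for the 30-second grace period.

package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func main() {
	// The trap makes the process ignore SIGTERM, like a slow-to-exit container.
	cmd := exec.Command("sh", "-c", "trap '' TERM; sleep 60")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	// Phase 1: polite stop request.
	_ = cmd.Process.Signal(syscall.SIGTERM)
	select {
	case <-done:
		fmt.Println("exited within the grace period")
	case <-time.After(3 * time.Second): // stands in for gracePeriod=30
		// Phase 2: grace period expired, force-kill.
		_ = cmd.Process.Kill()
		<-done
		fmt.Println("force-killed after the grace period")
	}
}
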
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:26:39.176804 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.176780 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x"] Mar 18 17:26:39.250981 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.250940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qfk5\" (UniqueName: \"kubernetes.io/projected/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kube-api-access-2qfk5\") pod \"isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x\" (UID: \"3f88ca60-5bec-41c1-8d49-e2d94b7e1417\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:26:39.251162 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.250993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x\" (UID: \"3f88ca60-5bec-41c1-8d49-e2d94b7e1417\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:26:39.351981 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.351940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qfk5\" (UniqueName: \"kubernetes.io/projected/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kube-api-access-2qfk5\") pod \"isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x\" (UID: \"3f88ca60-5bec-41c1-8d49-e2d94b7e1417\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:26:39.352164 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.351991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x\" (UID: \"3f88ca60-5bec-41c1-8d49-e2d94b7e1417\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:26:39.352349 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.352330 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x\" (UID: \"3f88ca60-5bec-41c1-8d49-e2d94b7e1417\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:26:39.359859 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.359825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qfk5\" (UniqueName: \"kubernetes.io/projected/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kube-api-access-2qfk5\") pod \"isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x\" (UID: \"3f88ca60-5bec-41c1-8d49-e2d94b7e1417\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:26:39.479228 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.479145 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:26:39.616805 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.616773 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x"] Mar 18 17:26:39.618091 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:26:39.618064 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f88ca60_5bec_41c1_8d49_e2d94b7e1417.slice/crio-8bc00d3341b7effaf255a29407db211153d369a48eb44eb75a2295cc656c2c5b WatchSource:0}: Error finding container 8bc00d3341b7effaf255a29407db211153d369a48eb44eb75a2295cc656c2c5b: Status 404 returned error can't find the container with id 8bc00d3341b7effaf255a29407db211153d369a48eb44eb75a2295cc656c2c5b Mar 18 17:26:39.715695 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.715677 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:39.806266 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.806183 2575 generic.go:358] "Generic (PLEG): container finished" podID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerID="c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420" exitCode=0 Mar 18 17:26:39.806266 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.806255 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" Mar 18 17:26:39.806726 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.806263 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" event={"ID":"ed74f7ff-70ff-40cd-804b-b888a032192c","Type":"ContainerDied","Data":"c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420"} Mar 18 17:26:39.806726 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.806295 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" event={"ID":"ed74f7ff-70ff-40cd-804b-b888a032192c","Type":"ContainerDied","Data":"c499dedc33164741dcecff39f2bda4b2ce2efef4193f576a5e9d437a5de11c43"} Mar 18 17:26:39.806726 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.806311 2575 scope.go:117] "RemoveContainer" containerID="c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420" Mar 18 17:26:39.807885 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.807862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" event={"ID":"3f88ca60-5bec-41c1-8d49-e2d94b7e1417","Type":"ContainerStarted","Data":"50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94"} Mar 18 17:26:39.807885 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.807888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" event={"ID":"3f88ca60-5bec-41c1-8d49-e2d94b7e1417","Type":"ContainerStarted","Data":"8bc00d3341b7effaf255a29407db211153d369a48eb44eb75a2295cc656c2c5b"} Mar 18 17:26:39.814750 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.814734 2575 scope.go:117] "RemoveContainer" containerID="16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed" Mar 18 17:26:39.826394 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.826033 2575 scope.go:117] 
"RemoveContainer" containerID="c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420" Mar 18 17:26:39.826394 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:26:39.826378 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420\": container with ID starting with c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420 not found: ID does not exist" containerID="c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420" Mar 18 17:26:39.826535 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.826409 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420"} err="failed to get container status \"c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420\": rpc error: code = NotFound desc = could not find container \"c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420\": container with ID starting with c2f331a413bda4618aa1acebd946a6d0070fbc7b0d307a150dab88adfd370420 not found: ID does not exist" Mar 18 17:26:39.826535 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.826426 2575 scope.go:117] "RemoveContainer" containerID="16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed" Mar 18 17:26:39.826731 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:26:39.826713 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed\": container with ID starting with 16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed not found: ID does not exist" containerID="16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed" Mar 18 17:26:39.826800 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.826734 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed"} err="failed to get container status \"16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed\": rpc error: code = NotFound desc = could not find container \"16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed\": container with ID starting with 16217c7d3f5f5b0d24469ec4305fc1af52af8769a04b9b5973e1566e3ca0fbed not found: ID does not exist" Mar 18 17:26:39.855644 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.855627 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed74f7ff-70ff-40cd-804b-b888a032192c-kserve-provision-location\") pod \"ed74f7ff-70ff-40cd-804b-b888a032192c\" (UID: \"ed74f7ff-70ff-40cd-804b-b888a032192c\") " Mar 18 17:26:39.855761 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.855712 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmrtx\" (UniqueName: \"kubernetes.io/projected/ed74f7ff-70ff-40cd-804b-b888a032192c-kube-api-access-cmrtx\") pod \"ed74f7ff-70ff-40cd-804b-b888a032192c\" (UID: \"ed74f7ff-70ff-40cd-804b-b888a032192c\") " Mar 18 17:26:39.858004 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.857976 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed74f7ff-70ff-40cd-804b-b888a032192c-kube-api-access-cmrtx" (OuterVolumeSpecName: "kube-api-access-cmrtx") 
pod "ed74f7ff-70ff-40cd-804b-b888a032192c" (UID: "ed74f7ff-70ff-40cd-804b-b888a032192c"). InnerVolumeSpecName "kube-api-access-cmrtx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:26:39.862752 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.862730 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed74f7ff-70ff-40cd-804b-b888a032192c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ed74f7ff-70ff-40cd-804b-b888a032192c" (UID: "ed74f7ff-70ff-40cd-804b-b888a032192c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:26:39.956810 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.956761 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cmrtx\" (UniqueName: \"kubernetes.io/projected/ed74f7ff-70ff-40cd-804b-b888a032192c-kube-api-access-cmrtx\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:26:39.956810 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:39.956800 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed74f7ff-70ff-40cd-804b-b888a032192c-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:26:40.126661 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:40.126632 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v"] Mar 18 17:26:40.130217 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:40.130193 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v"] Mar 18 17:26:40.290902 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:40.290869 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" path="/var/lib/kubelet/pods/ed74f7ff-70ff-40cd-804b-b888a032192c/volumes" Mar 18 17:26:40.711334 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:40.711299 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-798668567d-5z22v" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: i/o timeout" Mar 18 17:26:43.821995 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:43.821931 2575 generic.go:358] "Generic (PLEG): container finished" podID="3f88ca60-5bec-41c1-8d49-e2d94b7e1417" containerID="50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94" exitCode=0 Mar 18 17:26:43.822321 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:43.822006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" event={"ID":"3f88ca60-5bec-41c1-8d49-e2d94b7e1417","Type":"ContainerDied","Data":"50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94"} Mar 18 17:26:44.826772 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:44.826737 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" event={"ID":"3f88ca60-5bec-41c1-8d49-e2d94b7e1417","Type":"ContainerStarted","Data":"46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50"} Mar 18 17:26:44.827128 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:44.826981 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:26:44.843147 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:26:44.843105 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" podStartSLOduration=5.843092039 podStartE2EDuration="5.843092039s" podCreationTimestamp="2026-03-18 17:26:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:26:44.84141834 +0000 UTC m=+2523.152671264" watchObservedRunningTime="2026-03-18 17:26:44.843092039 +0000 UTC m=+2523.154344959" Mar 18 17:26:49.287505 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:26:49.287473 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:27:01.287071 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:27:01.287041 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:27:15.286953 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:27:15.286895 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:27:15.834377 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:15.834325 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:27:19.001432 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.001400 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x"] Mar 18 17:27:19.001806 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.001674 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" podUID="3f88ca60-5bec-41c1-8d49-e2d94b7e1417" containerName="kserve-container" containerID="cri-o://46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50" gracePeriod=30 Mar 18 17:27:19.259624 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.259534 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx"] Mar 18 17:27:19.259942 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.259924 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerName="kserve-container" Mar 18 17:27:19.260029 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.259946 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerName="kserve-container" Mar 18 17:27:19.260029 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.259968 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerName="storage-initializer" Mar 18 17:27:19.260029 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.259977 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerName="storage-initializer" Mar 18 17:27:19.260192 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.260065 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed74f7ff-70ff-40cd-804b-b888a032192c" containerName="kserve-container" Mar 18 17:27:19.263339 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.263318 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:27:19.269825 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.269799 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx"] Mar 18 17:27:19.329897 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.329872 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a8176e8-e342-4c6b-99d2-509149d44b56-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-78c574d5fb-nkctx\" (UID: \"0a8176e8-e342-4c6b-99d2-509149d44b56\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:27:19.330007 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.329912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p6g5\" (UniqueName: \"kubernetes.io/projected/0a8176e8-e342-4c6b-99d2-509149d44b56-kube-api-access-6p6g5\") pod \"isvc-sklearn-v2-predictor-78c574d5fb-nkctx\" (UID: \"0a8176e8-e342-4c6b-99d2-509149d44b56\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:27:19.431049 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.431025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6p6g5\" (UniqueName: \"kubernetes.io/projected/0a8176e8-e342-4c6b-99d2-509149d44b56-kube-api-access-6p6g5\") pod \"isvc-sklearn-v2-predictor-78c574d5fb-nkctx\" (UID: \"0a8176e8-e342-4c6b-99d2-509149d44b56\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:27:19.431151 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.431092 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a8176e8-e342-4c6b-99d2-509149d44b56-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-78c574d5fb-nkctx\" (UID: \"0a8176e8-e342-4c6b-99d2-509149d44b56\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:27:19.431426 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.431405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a8176e8-e342-4c6b-99d2-509149d44b56-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-78c574d5fb-nkctx\" (UID: \"0a8176e8-e342-4c6b-99d2-509149d44b56\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:27:19.440620 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.440597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p6g5\" (UniqueName: \"kubernetes.io/projected/0a8176e8-e342-4c6b-99d2-509149d44b56-kube-api-access-6p6g5\") pod \"isvc-sklearn-v2-predictor-78c574d5fb-nkctx\" (UID: \"0a8176e8-e342-4c6b-99d2-509149d44b56\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:27:19.575246 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.575180 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:27:19.701512 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.701486 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx"] Mar 18 17:27:19.703675 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:27:19.703647 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a8176e8_e342_4c6b_99d2_509149d44b56.slice/crio-ebf74d3e657eade19fac6ba24b9630b287bd6aad82e227719708132f0f2b87c9 WatchSource:0}: Error finding container ebf74d3e657eade19fac6ba24b9630b287bd6aad82e227719708132f0f2b87c9: Status 404 returned error can't find the container with id ebf74d3e657eade19fac6ba24b9630b287bd6aad82e227719708132f0f2b87c9 Mar 18 17:27:19.943228 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.943199 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" event={"ID":"0a8176e8-e342-4c6b-99d2-509149d44b56","Type":"ContainerStarted","Data":"11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec"} Mar 18 17:27:19.943228 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:19.943234 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" event={"ID":"0a8176e8-e342-4c6b-99d2-509149d44b56","Type":"ContainerStarted","Data":"ebf74d3e657eade19fac6ba24b9630b287bd6aad82e227719708132f0f2b87c9"} Mar 18 17:27:23.958064 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:23.958030 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerID="11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec" exitCode=0 Mar 18 17:27:23.958446 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:23.958107 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" event={"ID":"0a8176e8-e342-4c6b-99d2-509149d44b56","Type":"ContainerDied","Data":"11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec"} Mar 18 17:27:24.255700 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.255681 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:27:24.367079 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.367050 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qfk5\" (UniqueName: \"kubernetes.io/projected/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kube-api-access-2qfk5\") pod \"3f88ca60-5bec-41c1-8d49-e2d94b7e1417\" (UID: \"3f88ca60-5bec-41c1-8d49-e2d94b7e1417\") " Mar 18 17:27:24.367209 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.367157 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kserve-provision-location\") pod \"3f88ca60-5bec-41c1-8d49-e2d94b7e1417\" (UID: \"3f88ca60-5bec-41c1-8d49-e2d94b7e1417\") " Mar 18 17:27:24.367747 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.367514 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3f88ca60-5bec-41c1-8d49-e2d94b7e1417" (UID: "3f88ca60-5bec-41c1-8d49-e2d94b7e1417"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:27:24.369309 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.369287 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kube-api-access-2qfk5" (OuterVolumeSpecName: "kube-api-access-2qfk5") pod "3f88ca60-5bec-41c1-8d49-e2d94b7e1417" (UID: "3f88ca60-5bec-41c1-8d49-e2d94b7e1417"). InnerVolumeSpecName "kube-api-access-2qfk5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:27:24.468617 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.468581 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:27:24.468617 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.468617 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2qfk5\" (UniqueName: \"kubernetes.io/projected/3f88ca60-5bec-41c1-8d49-e2d94b7e1417-kube-api-access-2qfk5\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:27:24.964242 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.964211 2575 generic.go:358] "Generic (PLEG): container finished" podID="3f88ca60-5bec-41c1-8d49-e2d94b7e1417" containerID="46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50" exitCode=0 Mar 18 17:27:24.964635 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.964280 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" Mar 18 17:27:24.964635 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.964294 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" event={"ID":"3f88ca60-5bec-41c1-8d49-e2d94b7e1417","Type":"ContainerDied","Data":"46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50"} Mar 18 17:27:24.964635 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.964337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x" event={"ID":"3f88ca60-5bec-41c1-8d49-e2d94b7e1417","Type":"ContainerDied","Data":"8bc00d3341b7effaf255a29407db211153d369a48eb44eb75a2295cc656c2c5b"} Mar 18 17:27:24.964635 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.964372 2575 scope.go:117] "RemoveContainer" containerID="46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50" Mar 18 17:27:24.966310 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.966289 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" event={"ID":"0a8176e8-e342-4c6b-99d2-509149d44b56","Type":"ContainerStarted","Data":"249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7"} Mar 18 17:27:24.966630 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.966610 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:27:24.968036 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.968002 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Mar 18 17:27:24.976800 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.976781 2575 scope.go:117] "RemoveContainer" containerID="50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94" Mar 18 17:27:24.983573 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.983530 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podStartSLOduration=5.9835161580000005 podStartE2EDuration="5.983516158s" podCreationTimestamp="2026-03-18 17:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:27:24.981537603 +0000 UTC m=+2563.292790519" watchObservedRunningTime="2026-03-18 17:27:24.983516158 +0000 UTC m=+2563.294769079" Mar 18 17:27:24.992542 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.992523 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x"] Mar 18 17:27:24.998419 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:24.998400 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-65df85985c-84k8x"] Mar 18 17:27:25.001008 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:25.000985 2575 scope.go:117] "RemoveContainer" containerID="46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50" Mar 18 17:27:25.001259 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:27:25.001244 2575 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50\": container with ID starting with 46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50 not found: ID does not exist" containerID="46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50" Mar 18 17:27:25.001312 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:25.001266 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50"} err="failed to get container status \"46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50\": rpc error: code = NotFound desc = could not find container \"46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50\": container with ID starting with 46bdb62cb07e94e51f19a21c6a9567ad1c424a241f6dae677884ca62f1ed0b50 not found: ID does not exist" Mar 18 17:27:25.001312 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:25.001288 2575 scope.go:117] "RemoveContainer" containerID="50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94" Mar 18 17:27:25.001556 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:27:25.001539 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94\": container with ID starting with 50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94 not found: ID does not exist" containerID="50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94" Mar 18 17:27:25.001609 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:25.001562 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94"} err="failed to get container status \"50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94\": rpc error: code = NotFound desc = could not find container \"50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94\": container with ID starting with 50bdfda2dad05e9ed44167fb8487175963925c1e4e0bbf207b0b4e3a80c12e94 not found: ID does not exist" Mar 18 17:27:25.971610 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:25.971571 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Mar 18 17:27:26.296487 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:26.296414 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f88ca60-5bec-41c1-8d49-e2d94b7e1417" path="/var/lib/kubelet/pods/3f88ca60-5bec-41c1-8d49-e2d94b7e1417/volumes" Mar 18 17:27:27.287251 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:27:27.287214 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:27:35.971914 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:35.971872 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Mar 18 17:27:38.287342 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:27:38.287307 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:27:45.971736 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:45.971698 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Mar 18 17:27:53.286669 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:27:53.286533 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:27:55.971812 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:27:55.971770 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Mar 18 17:28:05.972206 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:05.972159 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Mar 18 17:28:06.287033 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:28:06.287002 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
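
The repeating "Probe failed ... connection refused" lines are periodic TCP-level connections to the pod IP while the model server is still loading; once something listens on 10.134.0.61:8080 the probe flips to ready, as it does at 17:28:48 further down. A minimal probe loop in the same spirit; the 2-second period and the retry count are illustrative, not this pod's probe spec.

package main

import (
	"fmt"
	"net"
	"time"
)

// probe mirrors what a TCP-level readiness check does: dial and close.
func probe(addr string) error {
	conn, err := net.DialTimeout("tcp", addr, time.Second)
	if err != nil {
		return err // e.g. "connect: connection refused" until a listener is up
	}
	return conn.Close()
}

func main() {
	addr := "10.134.0.61:8080" // pod IP and port from the log entries above
	for attempt := 1; attempt <= 3; attempt++ {
		if err := probe(addr); err != nil {
			fmt.Println("not ready:", err)
			time.Sleep(2 * time.Second) // illustrative probe period
			continue
		}
		fmt.Println("ready")
		return
	}
}
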
\"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:28:15.971808 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:15.971765 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Mar 18 17:28:20.577261 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:28:20.577187 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:28:20.577606 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:28:20.577352 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 17:28:20.578555 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:28:20.578523 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:28:25.972499 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:25.972460 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Mar 18 17:28:28.286067 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:28.286023 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Mar 18 17:28:35.286846 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:28:35.286814 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:28:38.286617 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:38.286578 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Mar 18 17:28:48.289437 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:48.289411 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:28:49.287329 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:28:49.287300 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:28:49.676475 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.676444 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx"] Mar 18 17:28:49.676863 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.676676 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" containerID="cri-o://249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7" gracePeriod=30 Mar 18 17:28:49.749085 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.749057 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc"] Mar 18 17:28:49.749446 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.749433 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f88ca60-5bec-41c1-8d49-e2d94b7e1417" containerName="kserve-container" Mar 18 17:28:49.749496 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.749451 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f88ca60-5bec-41c1-8d49-e2d94b7e1417" containerName="kserve-container" Mar 18 17:28:49.749496 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.749478 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f88ca60-5bec-41c1-8d49-e2d94b7e1417" containerName="storage-initializer" Mar 18 17:28:49.749496 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.749484 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f88ca60-5bec-41c1-8d49-e2d94b7e1417" containerName="storage-initializer" Mar 18 17:28:49.749600 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.749547 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f88ca60-5bec-41c1-8d49-e2d94b7e1417" 
containerName="kserve-container" Mar 18 17:28:49.752638 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.752622 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:28:49.758959 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.758936 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc"] Mar 18 17:28:49.784505 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.784485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nccrr\" (UniqueName: \"kubernetes.io/projected/65d0da32-07db-47d4-a164-e5b95c33ea25-kube-api-access-nccrr\") pod \"isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc\" (UID: \"65d0da32-07db-47d4-a164-e5b95c33ea25\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:28:49.784624 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.784539 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65d0da32-07db-47d4-a164-e5b95c33ea25-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc\" (UID: \"65d0da32-07db-47d4-a164-e5b95c33ea25\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:28:49.885421 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.885389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65d0da32-07db-47d4-a164-e5b95c33ea25-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc\" (UID: \"65d0da32-07db-47d4-a164-e5b95c33ea25\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:28:49.885579 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.885443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nccrr\" (UniqueName: \"kubernetes.io/projected/65d0da32-07db-47d4-a164-e5b95c33ea25-kube-api-access-nccrr\") pod \"isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc\" (UID: \"65d0da32-07db-47d4-a164-e5b95c33ea25\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:28:49.885714 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.885692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65d0da32-07db-47d4-a164-e5b95c33ea25-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc\" (UID: \"65d0da32-07db-47d4-a164-e5b95c33ea25\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:28:49.894505 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:49.894480 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nccrr\" (UniqueName: \"kubernetes.io/projected/65d0da32-07db-47d4-a164-e5b95c33ea25-kube-api-access-nccrr\") pod \"isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc\" (UID: \"65d0da32-07db-47d4-a164-e5b95c33ea25\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:28:50.064189 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:50.064115 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:28:50.188465 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:50.188438 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc"] Mar 18 17:28:50.190745 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:28:50.190715 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65d0da32_07db_47d4_a164_e5b95c33ea25.slice/crio-46db938d4c303b7abea9c67e69e32b8c06263ba844985a435011afe957a38f63 WatchSource:0}: Error finding container 46db938d4c303b7abea9c67e69e32b8c06263ba844985a435011afe957a38f63: Status 404 returned error can't find the container with id 46db938d4c303b7abea9c67e69e32b8c06263ba844985a435011afe957a38f63 Mar 18 17:28:50.250503 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:50.250481 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" event={"ID":"65d0da32-07db-47d4-a164-e5b95c33ea25","Type":"ContainerStarted","Data":"46db938d4c303b7abea9c67e69e32b8c06263ba844985a435011afe957a38f63"} Mar 18 17:28:51.255594 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:51.255556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" event={"ID":"65d0da32-07db-47d4-a164-e5b95c33ea25","Type":"ContainerStarted","Data":"e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd"} Mar 18 17:28:53.816836 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:53.816814 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:28:53.915187 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:53.915161 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p6g5\" (UniqueName: \"kubernetes.io/projected/0a8176e8-e342-4c6b-99d2-509149d44b56-kube-api-access-6p6g5\") pod \"0a8176e8-e342-4c6b-99d2-509149d44b56\" (UID: \"0a8176e8-e342-4c6b-99d2-509149d44b56\") " Mar 18 17:28:53.915285 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:53.915262 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a8176e8-e342-4c6b-99d2-509149d44b56-kserve-provision-location\") pod \"0a8176e8-e342-4c6b-99d2-509149d44b56\" (UID: \"0a8176e8-e342-4c6b-99d2-509149d44b56\") " Mar 18 17:28:53.915608 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:53.915586 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8176e8-e342-4c6b-99d2-509149d44b56-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0a8176e8-e342-4c6b-99d2-509149d44b56" (UID: "0a8176e8-e342-4c6b-99d2-509149d44b56"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:28:53.917343 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:53.917323 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8176e8-e342-4c6b-99d2-509149d44b56-kube-api-access-6p6g5" (OuterVolumeSpecName: "kube-api-access-6p6g5") pod "0a8176e8-e342-4c6b-99d2-509149d44b56" (UID: "0a8176e8-e342-4c6b-99d2-509149d44b56"). InnerVolumeSpecName "kube-api-access-6p6g5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:28:54.016311 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.016246 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a8176e8-e342-4c6b-99d2-509149d44b56-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:28:54.016311 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.016279 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6p6g5\" (UniqueName: \"kubernetes.io/projected/0a8176e8-e342-4c6b-99d2-509149d44b56-kube-api-access-6p6g5\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:28:54.266098 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.266068 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerID="249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7" exitCode=0 Mar 18 17:28:54.266222 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.266134 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" Mar 18 17:28:54.266222 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.266146 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" event={"ID":"0a8176e8-e342-4c6b-99d2-509149d44b56","Type":"ContainerDied","Data":"249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7"} Mar 18 17:28:54.266222 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.266175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx" event={"ID":"0a8176e8-e342-4c6b-99d2-509149d44b56","Type":"ContainerDied","Data":"ebf74d3e657eade19fac6ba24b9630b287bd6aad82e227719708132f0f2b87c9"} Mar 18 17:28:54.266222 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.266189 2575 scope.go:117] "RemoveContainer" containerID="249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7" Mar 18 17:28:54.274526 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.274503 2575 scope.go:117] "RemoveContainer" containerID="11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec" Mar 18 17:28:54.283350 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.283334 2575 scope.go:117] "RemoveContainer" containerID="249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7" Mar 18 17:28:54.283624 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:28:54.283605 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7\": container with ID starting with 249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7 not found: ID does not exist" containerID="249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7" Mar 18 17:28:54.283673 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.283632 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7"} err="failed to get container status \"249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7\": rpc error: code = NotFound desc = could not find container \"249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7\": container with ID starting with 
249de046f54fe8851495a91afe24a1ee7b56310a132ba980d0e3c01eb66e18e7 not found: ID does not exist" Mar 18 17:28:54.283673 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.283649 2575 scope.go:117] "RemoveContainer" containerID="11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec" Mar 18 17:28:54.284094 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:28:54.284001 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec\": container with ID starting with 11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec not found: ID does not exist" containerID="11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec" Mar 18 17:28:54.284094 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.284036 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec"} err="failed to get container status \"11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec\": rpc error: code = NotFound desc = could not find container \"11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec\": container with ID starting with 11ff6519eaa932d600fa009fb2ef6670825e8abebd2b1258180014e55e028fec not found: ID does not exist" Mar 18 17:28:54.286101 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.286082 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx"] Mar 18 17:28:54.290480 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:54.290463 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78c574d5fb-nkctx"] Mar 18 17:28:55.270317 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:55.270286 2575 generic.go:358] "Generic (PLEG): container finished" podID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerID="e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd" exitCode=0 Mar 18 17:28:55.270734 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:55.270379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" event={"ID":"65d0da32-07db-47d4-a164-e5b95c33ea25","Type":"ContainerDied","Data":"e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd"} Mar 18 17:28:56.276675 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:56.276644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" event={"ID":"65d0da32-07db-47d4-a164-e5b95c33ea25","Type":"ContainerStarted","Data":"550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347"} Mar 18 17:28:56.277095 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:56.276977 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:28:56.278209 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:56.278184 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Mar 18 17:28:56.289557 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:56.289534 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" path="/var/lib/kubelet/pods/0a8176e8-e342-4c6b-99d2-509149d44b56/volumes" Mar 18 17:28:56.292497 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:56.292458 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podStartSLOduration=7.292447028 podStartE2EDuration="7.292447028s" podCreationTimestamp="2026-03-18 17:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:28:56.292067683 +0000 UTC m=+2654.603320604" watchObservedRunningTime="2026-03-18 17:28:56.292447028 +0000 UTC m=+2654.603699948" Mar 18 17:28:57.280869 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:28:57.280839 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Mar 18 17:29:01.286776 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:29:01.286732 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:29:07.282005 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:29:07.281957 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Mar 18 17:29:13.287450 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:29:13.287422 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:29:17.281194 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:29:17.281139 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection 
refused" Mar 18 17:29:27.281047 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:29:27.280957 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Mar 18 17:29:28.287503 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:29:28.287292 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:29:37.281651 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:29:37.281606 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Mar 18 17:29:42.289374 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:29:42.289307 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:29:42.364117 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:29:42.364095 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:29:42.370339 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:29:42.370319 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:29:47.281613 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:29:47.281577 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Mar 18 17:29:57.281584 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:29:57.281544 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" 
podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Mar 18 17:29:57.286996 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:29:57.286971 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:30:01.286105 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:01.286063 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Mar 18 17:30:08.287259 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:30:08.287228 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:30:11.286577 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:11.286536 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Mar 18 17:30:20.286924 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:30:20.286895 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:30:21.287372 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:21.287338 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:30:29.688737 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:29.688702 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc"] Mar 18 17:30:29.689098 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:29.688953 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" containerID="cri-o://550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347" gracePeriod=30 Mar 18 17:30:29.953880 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:29.953805 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6"] Mar 18 17:30:29.954133 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:29.954121 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="storage-initializer" Mar 18 17:30:29.954176 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:29.954135 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="storage-initializer" Mar 18 17:30:29.954176 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:29.954150 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" Mar 18 17:30:29.954176 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:29.954155 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" Mar 18 17:30:29.954269 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:29.954216 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a8176e8-e342-4c6b-99d2-509149d44b56" containerName="kserve-container" Mar 18 17:30:29.956996 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:29.956979 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:30:29.977079 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:29.977056 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6"] Mar 18 17:30:30.089729 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:30.089703 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kserve-provision-location\") pod \"isvc-tensorflow-predictor-c46f4487f-p72c6\" (UID: \"7f9868a4-0feb-4e21-b01c-6bcba698b35a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:30:30.089880 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:30.089740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfnk9\" (UniqueName: \"kubernetes.io/projected/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kube-api-access-wfnk9\") pod \"isvc-tensorflow-predictor-c46f4487f-p72c6\" (UID: \"7f9868a4-0feb-4e21-b01c-6bcba698b35a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:30:30.191126 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:30.191089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kserve-provision-location\") pod \"isvc-tensorflow-predictor-c46f4487f-p72c6\" (UID: \"7f9868a4-0feb-4e21-b01c-6bcba698b35a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:30:30.191126 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:30.191126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfnk9\" (UniqueName: \"kubernetes.io/projected/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kube-api-access-wfnk9\") pod \"isvc-tensorflow-predictor-c46f4487f-p72c6\" (UID: \"7f9868a4-0feb-4e21-b01c-6bcba698b35a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:30:30.191489 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:30.191471 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kserve-provision-location\") pod \"isvc-tensorflow-predictor-c46f4487f-p72c6\" (UID: \"7f9868a4-0feb-4e21-b01c-6bcba698b35a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:30:30.202906 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:30.202880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfnk9\" (UniqueName: \"kubernetes.io/projected/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kube-api-access-wfnk9\") pod \"isvc-tensorflow-predictor-c46f4487f-p72c6\" (UID: \"7f9868a4-0feb-4e21-b01c-6bcba698b35a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:30:30.267117 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:30.267064 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:30:30.393268 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:30.393242 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6"] Mar 18 17:30:30.395927 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:30:30.395893 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f9868a4_0feb_4e21_b01c_6bcba698b35a.slice/crio-2cdc22f2cbbc480b872990f793b3702ab29c3d7bfe5bb8731dd3a5ec880c7956 WatchSource:0}: Error finding container 2cdc22f2cbbc480b872990f793b3702ab29c3d7bfe5bb8731dd3a5ec880c7956: Status 404 returned error can't find the container with id 2cdc22f2cbbc480b872990f793b3702ab29c3d7bfe5bb8731dd3a5ec880c7956 Mar 18 17:30:30.575079 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:30.574991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" event={"ID":"7f9868a4-0feb-4e21-b01c-6bcba698b35a","Type":"ContainerStarted","Data":"100d24a895a487588165e84ab1cf86206162aea9f05b33c65cc186aa31d66e66"} Mar 18 17:30:30.575079 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:30.575028 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" event={"ID":"7f9868a4-0feb-4e21-b01c-6bcba698b35a","Type":"ContainerStarted","Data":"2cdc22f2cbbc480b872990f793b3702ab29c3d7bfe5bb8731dd3a5ec880c7956"} Mar 18 17:30:31.286672 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:31.286126 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Mar 18 17:30:31.286963 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:30:31.286784 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:30:33.947147 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:33.947123 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:30:34.125567 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.125496 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nccrr\" (UniqueName: \"kubernetes.io/projected/65d0da32-07db-47d4-a164-e5b95c33ea25-kube-api-access-nccrr\") pod \"65d0da32-07db-47d4-a164-e5b95c33ea25\" (UID: \"65d0da32-07db-47d4-a164-e5b95c33ea25\") " Mar 18 17:30:34.125567 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.125564 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65d0da32-07db-47d4-a164-e5b95c33ea25-kserve-provision-location\") pod \"65d0da32-07db-47d4-a164-e5b95c33ea25\" (UID: \"65d0da32-07db-47d4-a164-e5b95c33ea25\") " Mar 18 17:30:34.125907 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.125881 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65d0da32-07db-47d4-a164-e5b95c33ea25-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "65d0da32-07db-47d4-a164-e5b95c33ea25" (UID: "65d0da32-07db-47d4-a164-e5b95c33ea25"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:30:34.127534 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.127510 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65d0da32-07db-47d4-a164-e5b95c33ea25-kube-api-access-nccrr" (OuterVolumeSpecName: "kube-api-access-nccrr") pod "65d0da32-07db-47d4-a164-e5b95c33ea25" (UID: "65d0da32-07db-47d4-a164-e5b95c33ea25"). InnerVolumeSpecName "kube-api-access-nccrr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:30:34.226491 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.226461 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65d0da32-07db-47d4-a164-e5b95c33ea25-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:30:34.226491 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.226484 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nccrr\" (UniqueName: \"kubernetes.io/projected/65d0da32-07db-47d4-a164-e5b95c33ea25-kube-api-access-nccrr\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:30:34.589374 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.589334 2575 generic.go:358] "Generic (PLEG): container finished" podID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerID="550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347" exitCode=0 Mar 18 17:30:34.589563 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.589393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" event={"ID":"65d0da32-07db-47d4-a164-e5b95c33ea25","Type":"ContainerDied","Data":"550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347"} Mar 18 17:30:34.589563 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.589421 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" Mar 18 17:30:34.589563 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.589432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc" event={"ID":"65d0da32-07db-47d4-a164-e5b95c33ea25","Type":"ContainerDied","Data":"46db938d4c303b7abea9c67e69e32b8c06263ba844985a435011afe957a38f63"} Mar 18 17:30:34.589563 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.589451 2575 scope.go:117] "RemoveContainer" containerID="550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347" Mar 18 17:30:34.598266 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.598250 2575 scope.go:117] "RemoveContainer" containerID="e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd" Mar 18 17:30:34.607710 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.607651 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc"] Mar 18 17:30:34.608647 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.608630 2575 scope.go:117] "RemoveContainer" containerID="550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347" Mar 18 17:30:34.608918 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:30:34.608897 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347\": container with ID starting with 550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347 not found: ID does not exist" containerID="550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347" Mar 18 17:30:34.609019 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.608947 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347"} err="failed to get container status \"550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347\": rpc error: code = NotFound desc = could not find container \"550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347\": container with ID starting with 550f8d694a51257c77d66122207179ddb43877bedc00b453b7021dbedd2e8347 not found: ID does not exist" Mar 18 17:30:34.609019 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.608967 2575 scope.go:117] "RemoveContainer" containerID="e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd" Mar 18 17:30:34.609241 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:30:34.609224 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd\": container with ID starting with e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd not found: ID does not exist" containerID="e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd" Mar 18 17:30:34.609314 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.609250 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd"} err="failed to get container status \"e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd\": rpc error: code = NotFound desc = could not find container \"e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd\": container with ID starting with 
e535d76e1496ddc05c5b8960c993df86a76298812220c4eef746ea6493f584dd not found: ID does not exist" Mar 18 17:30:34.611182 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:34.611162 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-58b7c85c9f-wldnc"] Mar 18 17:30:35.593929 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:35.593899 2575 generic.go:358] "Generic (PLEG): container finished" podID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" containerID="100d24a895a487588165e84ab1cf86206162aea9f05b33c65cc186aa31d66e66" exitCode=0 Mar 18 17:30:35.594340 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:35.593975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" event={"ID":"7f9868a4-0feb-4e21-b01c-6bcba698b35a","Type":"ContainerDied","Data":"100d24a895a487588165e84ab1cf86206162aea9f05b33c65cc186aa31d66e66"} Mar 18 17:30:36.293110 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:36.293081 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" path="/var/lib/kubelet/pods/65d0da32-07db-47d4-a164-e5b95c33ea25/volumes" Mar 18 17:30:39.611958 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:39.611871 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" event={"ID":"7f9868a4-0feb-4e21-b01c-6bcba698b35a","Type":"ContainerStarted","Data":"c300242391a86e43d9574560f3ca2fb60a5028850b9e9cf2192cf6c7f48d74ad"} Mar 18 17:30:39.612311 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:39.612155 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:30:39.613533 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:39.613499 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" podUID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Mar 18 17:30:39.628346 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:39.628289 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" podStartSLOduration=6.927667355 podStartE2EDuration="10.628279525s" podCreationTimestamp="2026-03-18 17:30:29 +0000 UTC" firstStartedPulling="2026-03-18 17:30:35.595139824 +0000 UTC m=+2753.906392723" lastFinishedPulling="2026-03-18 17:30:39.295751991 +0000 UTC m=+2757.607004893" observedRunningTime="2026-03-18 17:30:39.627378123 +0000 UTC m=+2757.938631041" watchObservedRunningTime="2026-03-18 17:30:39.628279525 +0000 UTC m=+2757.939532446" Mar 18 17:30:40.616746 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:40.616708 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" podUID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Mar 18 17:30:43.287470 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:30:43.287441 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: 
initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:30:50.617938 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:50.617859 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:30:58.287129 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:30:58.286989 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:30:58.287386 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:30:58.287161 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:31:09.483432 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.480465 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6"] Mar 18 17:31:09.483432 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.480905 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" podUID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" containerName="kserve-container" containerID="cri-o://c300242391a86e43d9574560f3ca2fb60a5028850b9e9cf2192cf6c7f48d74ad" gracePeriod=30 Mar 18 17:31:09.659916 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.659884 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678"] Mar 18 17:31:09.660273 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.660257 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" Mar 18 17:31:09.660318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.660277 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" Mar 18 17:31:09.660318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.660296 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="storage-initializer" Mar 18 17:31:09.660318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.660301 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="storage-initializer" Mar 18 17:31:09.660449 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.660384 2575 
memory_manager.go:356] "RemoveStaleState removing state" podUID="65d0da32-07db-47d4-a164-e5b95c33ea25" containerName="kserve-container" Mar 18 17:31:09.663314 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.663298 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:31:09.680445 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.680420 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678"] Mar 18 17:31:09.777257 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.777185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678\" (UID: \"6809c3b0-cbf5-453b-a6f4-3c08d49a2403\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:31:09.777257 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.777225 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qsp5\" (UniqueName: \"kubernetes.io/projected/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kube-api-access-7qsp5\") pod \"isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678\" (UID: \"6809c3b0-cbf5-453b-a6f4-3c08d49a2403\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:31:09.878340 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.878302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678\" (UID: \"6809c3b0-cbf5-453b-a6f4-3c08d49a2403\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:31:09.878512 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.878353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qsp5\" (UniqueName: \"kubernetes.io/projected/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kube-api-access-7qsp5\") pod \"isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678\" (UID: \"6809c3b0-cbf5-453b-a6f4-3c08d49a2403\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:31:09.878709 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.878691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678\" (UID: \"6809c3b0-cbf5-453b-a6f4-3c08d49a2403\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:31:09.888989 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.888961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qsp5\" (UniqueName: \"kubernetes.io/projected/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kube-api-access-7qsp5\") pod \"isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678\" (UID: \"6809c3b0-cbf5-453b-a6f4-3c08d49a2403\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:31:09.975004 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:09.974978 
2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:31:10.105607 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:10.105560 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678"] Mar 18 17:31:10.108499 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:31:10.108470 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6809c3b0_cbf5_453b_a6f4_3c08d49a2403.slice/crio-855b6c0ebc8a411aaf638cd9f4b6a7c2ca1b4c6abddb4cba97ddc7c17b9ab367 WatchSource:0}: Error finding container 855b6c0ebc8a411aaf638cd9f4b6a7c2ca1b4c6abddb4cba97ddc7c17b9ab367: Status 404 returned error can't find the container with id 855b6c0ebc8a411aaf638cd9f4b6a7c2ca1b4c6abddb4cba97ddc7c17b9ab367 Mar 18 17:31:10.715959 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:10.715923 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" event={"ID":"6809c3b0-cbf5-453b-a6f4-3c08d49a2403","Type":"ContainerStarted","Data":"b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73"} Mar 18 17:31:10.715959 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:10.715958 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" event={"ID":"6809c3b0-cbf5-453b-a6f4-3c08d49a2403","Type":"ContainerStarted","Data":"855b6c0ebc8a411aaf638cd9f4b6a7c2ca1b4c6abddb4cba97ddc7c17b9ab367"} Mar 18 17:31:13.287616 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:31:13.287579 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:31:14.730225 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:14.730191 2575 generic.go:358] "Generic (PLEG): container finished" podID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" containerID="b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73" exitCode=0 Mar 18 17:31:14.730585 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:14.730268 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" event={"ID":"6809c3b0-cbf5-453b-a6f4-3c08d49a2403","Type":"ContainerDied","Data":"b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73"} Mar 18 17:31:15.735012 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:15.734976 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" event={"ID":"6809c3b0-cbf5-453b-a6f4-3c08d49a2403","Type":"ContainerStarted","Data":"59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999"} Mar 18 17:31:15.735504 ip-10-0-139-49 kubenswrapper[2575]: 
I0318 17:31:15.735335 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:31:15.736630 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:15.736605 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" podUID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Mar 18 17:31:15.758222 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:15.758181 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" podStartSLOduration=6.758169043 podStartE2EDuration="6.758169043s" podCreationTimestamp="2026-03-18 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:31:15.75696625 +0000 UTC m=+2794.068219172" watchObservedRunningTime="2026-03-18 17:31:15.758169043 +0000 UTC m=+2794.069421965" Mar 18 17:31:16.738506 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:16.738470 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" podUID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Mar 18 17:31:25.286713 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:31:25.286682 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:31:26.739943 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:26.739909 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:31:36.286868 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:31:36.286837 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:31:38.983527 ip-10-0-139-49 kubenswrapper[2575]: I0318 
17:31:38.983497 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678"] Mar 18 17:31:38.983990 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:38.983770 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" podUID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" containerName="kserve-container" containerID="cri-o://59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999" gracePeriod=30 Mar 18 17:31:39.057072 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.057044 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj"] Mar 18 17:31:39.059467 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.059451 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:31:39.083064 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.083036 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj"] Mar 18 17:31:39.196707 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.196673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bp9x\" (UniqueName: \"kubernetes.io/projected/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kube-api-access-6bp9x\") pod \"isvc-triton-predictor-54cbc8ffbb-28sjj\" (UID: \"7091abb3-5a2c-46a8-ae39-577cb8493fb3\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:31:39.196847 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.196719 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kserve-provision-location\") pod \"isvc-triton-predictor-54cbc8ffbb-28sjj\" (UID: \"7091abb3-5a2c-46a8-ae39-577cb8493fb3\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:31:39.297821 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.297756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bp9x\" (UniqueName: \"kubernetes.io/projected/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kube-api-access-6bp9x\") pod \"isvc-triton-predictor-54cbc8ffbb-28sjj\" (UID: \"7091abb3-5a2c-46a8-ae39-577cb8493fb3\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:31:39.297821 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.297791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kserve-provision-location\") pod \"isvc-triton-predictor-54cbc8ffbb-28sjj\" (UID: \"7091abb3-5a2c-46a8-ae39-577cb8493fb3\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:31:39.298099 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.298082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kserve-provision-location\") pod \"isvc-triton-predictor-54cbc8ffbb-28sjj\" (UID: \"7091abb3-5a2c-46a8-ae39-577cb8493fb3\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:31:39.309610 ip-10-0-139-49 kubenswrapper[2575]: I0318 
17:31:39.309585 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bp9x\" (UniqueName: \"kubernetes.io/projected/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kube-api-access-6bp9x\") pod \"isvc-triton-predictor-54cbc8ffbb-28sjj\" (UID: \"7091abb3-5a2c-46a8-ae39-577cb8493fb3\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:31:39.368603 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.368574 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:31:39.495896 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.495862 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj"] Mar 18 17:31:39.498908 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:31:39.498876 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7091abb3_5a2c_46a8_ae39_577cb8493fb3.slice/crio-2c277036e7619348c6cf2b5a27bba6d2415d22ddfa4b8c5410abaf8816d9277b WatchSource:0}: Error finding container 2c277036e7619348c6cf2b5a27bba6d2415d22ddfa4b8c5410abaf8816d9277b: Status 404 returned error can't find the container with id 2c277036e7619348c6cf2b5a27bba6d2415d22ddfa4b8c5410abaf8816d9277b Mar 18 17:31:39.815989 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.815890 2575 generic.go:358] "Generic (PLEG): container finished" podID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" containerID="c300242391a86e43d9574560f3ca2fb60a5028850b9e9cf2192cf6c7f48d74ad" exitCode=137 Mar 18 17:31:39.815989 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.815973 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" event={"ID":"7f9868a4-0feb-4e21-b01c-6bcba698b35a","Type":"ContainerDied","Data":"c300242391a86e43d9574560f3ca2fb60a5028850b9e9cf2192cf6c7f48d74ad"} Mar 18 17:31:39.817342 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.817311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" event={"ID":"7091abb3-5a2c-46a8-ae39-577cb8493fb3","Type":"ContainerStarted","Data":"5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4"} Mar 18 17:31:39.817342 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:39.817340 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" event={"ID":"7091abb3-5a2c-46a8-ae39-577cb8493fb3","Type":"ContainerStarted","Data":"2c277036e7619348c6cf2b5a27bba6d2415d22ddfa4b8c5410abaf8816d9277b"} Mar 18 17:31:40.115599 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.115577 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:31:40.203962 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.203930 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kserve-provision-location\") pod \"7f9868a4-0feb-4e21-b01c-6bcba698b35a\" (UID: \"7f9868a4-0feb-4e21-b01c-6bcba698b35a\") " Mar 18 17:31:40.204119 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.203999 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfnk9\" (UniqueName: \"kubernetes.io/projected/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kube-api-access-wfnk9\") pod \"7f9868a4-0feb-4e21-b01c-6bcba698b35a\" (UID: \"7f9868a4-0feb-4e21-b01c-6bcba698b35a\") " Mar 18 17:31:40.206129 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.206099 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kube-api-access-wfnk9" (OuterVolumeSpecName: "kube-api-access-wfnk9") pod "7f9868a4-0feb-4e21-b01c-6bcba698b35a" (UID: "7f9868a4-0feb-4e21-b01c-6bcba698b35a"). InnerVolumeSpecName "kube-api-access-wfnk9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:31:40.215138 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.215116 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7f9868a4-0feb-4e21-b01c-6bcba698b35a" (UID: "7f9868a4-0feb-4e21-b01c-6bcba698b35a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:31:40.304758 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.304732 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wfnk9\" (UniqueName: \"kubernetes.io/projected/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kube-api-access-wfnk9\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:31:40.304847 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.304765 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f9868a4-0feb-4e21-b01c-6bcba698b35a-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:31:40.821599 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.821576 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" Mar 18 17:31:40.821736 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.821607 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6" event={"ID":"7f9868a4-0feb-4e21-b01c-6bcba698b35a","Type":"ContainerDied","Data":"2cdc22f2cbbc480b872990f793b3702ab29c3d7bfe5bb8731dd3a5ec880c7956"} Mar 18 17:31:40.821736 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.821661 2575 scope.go:117] "RemoveContainer" containerID="c300242391a86e43d9574560f3ca2fb60a5028850b9e9cf2192cf6c7f48d74ad" Mar 18 17:31:40.829995 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.829977 2575 scope.go:117] "RemoveContainer" containerID="100d24a895a487588165e84ab1cf86206162aea9f05b33c65cc186aa31d66e66" Mar 18 17:31:40.845186 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.845164 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6"] Mar 18 17:31:40.849454 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:40.849436 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-c46f4487f-p72c6"] Mar 18 17:31:42.295179 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:42.295147 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" path="/var/lib/kubelet/pods/7f9868a4-0feb-4e21-b01c-6bcba698b35a/volumes" Mar 18 17:31:43.832573 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:43.832546 2575 generic.go:358] "Generic (PLEG): container finished" podID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" containerID="5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4" exitCode=0 Mar 18 17:31:43.832895 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:31:43.832621 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" event={"ID":"7091abb3-5a2c-46a8-ae39-577cb8493fb3","Type":"ContainerDied","Data":"5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4"} Mar 18 17:31:49.287413 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:31:49.287219 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:32:02.291433 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:02.291303 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; 
artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:32:09.006518 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:09.006477 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7091abb3_5a2c_46a8_ae39_577cb8493fb3.slice/crio-conmon-5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7091abb3_5a2c_46a8_ae39_577cb8493fb3.slice/crio-5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:32:09.007003 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:09.006971 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7091abb3_5a2c_46a8_ae39_577cb8493fb3.slice/crio-conmon-5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7091abb3_5a2c_46a8_ae39_577cb8493fb3.slice/crio-5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:32:09.007107 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:09.006638 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7091abb3_5a2c_46a8_ae39_577cb8493fb3.slice/crio-conmon-5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7091abb3_5a2c_46a8_ae39_577cb8493fb3.slice/crio-5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:32:09.007435 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:09.007400 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7091abb3_5a2c_46a8_ae39_577cb8493fb3.slice/crio-conmon-5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7091abb3_5a2c_46a8_ae39_577cb8493fb3.slice/crio-5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4.scope\": RecentStats: unable to find data in memory cache]" Mar 18 17:32:09.684619 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.684592 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:32:09.760427 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.760168 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qsp5\" (UniqueName: \"kubernetes.io/projected/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kube-api-access-7qsp5\") pod \"6809c3b0-cbf5-453b-a6f4-3c08d49a2403\" (UID: \"6809c3b0-cbf5-453b-a6f4-3c08d49a2403\") " Mar 18 17:32:09.760427 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.760269 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kserve-provision-location\") pod \"6809c3b0-cbf5-453b-a6f4-3c08d49a2403\" (UID: \"6809c3b0-cbf5-453b-a6f4-3c08d49a2403\") " Mar 18 17:32:09.764704 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.764544 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kube-api-access-7qsp5" (OuterVolumeSpecName: "kube-api-access-7qsp5") pod "6809c3b0-cbf5-453b-a6f4-3c08d49a2403" (UID: "6809c3b0-cbf5-453b-a6f4-3c08d49a2403"). InnerVolumeSpecName "kube-api-access-7qsp5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:32:09.764704 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.764543 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6809c3b0-cbf5-453b-a6f4-3c08d49a2403" (UID: "6809c3b0-cbf5-453b-a6f4-3c08d49a2403"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:32:09.861307 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.861253 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qsp5\" (UniqueName: \"kubernetes.io/projected/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kube-api-access-7qsp5\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:32:09.861307 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.861285 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6809c3b0-cbf5-453b-a6f4-3c08d49a2403-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:32:09.971594 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.971558 2575 generic.go:358] "Generic (PLEG): container finished" podID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" containerID="59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999" exitCode=137 Mar 18 17:32:09.971755 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.971618 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" event={"ID":"6809c3b0-cbf5-453b-a6f4-3c08d49a2403","Type":"ContainerDied","Data":"59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999"} Mar 18 17:32:09.971755 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.971650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" event={"ID":"6809c3b0-cbf5-453b-a6f4-3c08d49a2403","Type":"ContainerDied","Data":"855b6c0ebc8a411aaf638cd9f4b6a7c2ca1b4c6abddb4cba97ddc7c17b9ab367"} Mar 18 17:32:09.971755 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.971657 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678" Mar 18 17:32:09.971755 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.971674 2575 scope.go:117] "RemoveContainer" containerID="59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999" Mar 18 17:32:09.983898 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:09.983812 2575 scope.go:117] "RemoveContainer" containerID="b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73" Mar 18 17:32:10.000207 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:10.000185 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678"] Mar 18 17:32:10.004814 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:10.004791 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-6d8865c5fc-8p678"] Mar 18 17:32:10.009656 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:10.009336 2575 scope.go:117] "RemoveContainer" containerID="59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999" Mar 18 17:32:10.009920 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:10.009792 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999\": container with ID starting with 59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999 not found: ID does not exist" containerID="59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999" Mar 18 17:32:10.009920 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:10.009827 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999"} err="failed to get container status \"59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999\": rpc error: code = NotFound desc = could not find container \"59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999\": container with ID starting with 59fe9bce0c7fb539e70cc32939d3baee4ad86ec8bd20622ddbde6e540e129999 not found: ID does not exist" Mar 18 17:32:10.009920 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:10.009850 2575 scope.go:117] "RemoveContainer" containerID="b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73" Mar 18 17:32:10.010139 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:10.010117 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73\": container with ID starting with b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73 not found: ID does not exist" containerID="b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73" Mar 18 17:32:10.010230 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:32:10.010148 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73"} err="failed to get container status \"b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73\": rpc error: code = NotFound desc = could not find container \"b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73\": container with ID starting with b1ba36ec488d7eebc7b17835dd82207c0a7d970a5e42528119ba6c90107bef73 not found: ID does not exist" Mar 18 17:32:10.293038 ip-10-0-139-49 
kubenswrapper[2575]: I0318 17:32:10.292730 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" path="/var/lib/kubelet/pods/6809c3b0-cbf5-453b-a6f4-3c08d49a2403/volumes" Mar 18 17:32:17.287955 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:17.287806 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:32:28.287727 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:28.287526 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:32:43.287442 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:43.287407 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:32:55.287092 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:32:55.286985 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" 
pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:33:09.286976 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:33:09.286946 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:33:24.683723 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:33:24.683676 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:33:24.684196 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:33:24.683851 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 17:33:24.685081 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:33:24.685043 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:33:37.287016 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:33:37.286916 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:33:39.302971 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:33:39.302940 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" event={"ID":"7091abb3-5a2c-46a8-ae39-577cb8493fb3","Type":"ContainerStarted","Data":"5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e"} Mar 18 17:33:39.303395 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:33:39.303197 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 
18 17:33:39.304513 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:33:39.304479 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" podUID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Mar 18 17:33:39.320099 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:33:39.320061 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" podStartSLOduration=5.322866429 podStartE2EDuration="2m0.3200483s" podCreationTimestamp="2026-03-18 17:31:39 +0000 UTC" firstStartedPulling="2026-03-18 17:31:43.833639303 +0000 UTC m=+2822.144892202" lastFinishedPulling="2026-03-18 17:33:38.830821171 +0000 UTC m=+2937.142074073" observedRunningTime="2026-03-18 17:33:39.318536504 +0000 UTC m=+2937.629789424" watchObservedRunningTime="2026-03-18 17:33:39.3200483 +0000 UTC m=+2937.631301220" Mar 18 17:33:40.305983 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:33:40.305948 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" podUID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Mar 18 17:33:50.307299 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:33:50.307229 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:33:52.289309 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:33:52.289274 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:34:01.481242 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.481206 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj"] Mar 18 17:34:01.481814 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.481590 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" podUID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" containerName="kserve-container" containerID="cri-o://5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e" gracePeriod=30 Mar 18 17:34:01.648382 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.648331 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz"] Mar 18 17:34:01.648700 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.648686 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" containerName="storage-initializer" Mar 18 17:34:01.648748 ip-10-0-139-49 
kubenswrapper[2575]: I0318 17:34:01.648702 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" containerName="storage-initializer" Mar 18 17:34:01.648748 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.648722 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" containerName="storage-initializer" Mar 18 17:34:01.648748 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.648728 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" containerName="storage-initializer" Mar 18 17:34:01.648748 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.648738 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" containerName="kserve-container" Mar 18 17:34:01.648748 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.648744 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" containerName="kserve-container" Mar 18 17:34:01.648904 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.648750 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" containerName="kserve-container" Mar 18 17:34:01.648904 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.648756 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" containerName="kserve-container" Mar 18 17:34:01.648904 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.648807 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f9868a4-0feb-4e21-b01c-6bcba698b35a" containerName="kserve-container" Mar 18 17:34:01.648904 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.648814 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6809c3b0-cbf5-453b-a6f4-3c08d49a2403" containerName="kserve-container" Mar 18 17:34:01.653854 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.653835 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" Mar 18 17:34:01.662044 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.662021 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz"] Mar 18 17:34:01.788118 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.788056 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96b0921b-ceef-43c2-a811-c50a5d83de93-kserve-provision-location\") pod \"isvc-xgboost-predictor-7c946d496c-vs4lz\" (UID: \"96b0921b-ceef-43c2-a811-c50a5d83de93\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" Mar 18 17:34:01.788118 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.788110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttxtr\" (UniqueName: \"kubernetes.io/projected/96b0921b-ceef-43c2-a811-c50a5d83de93-kube-api-access-ttxtr\") pod \"isvc-xgboost-predictor-7c946d496c-vs4lz\" (UID: \"96b0921b-ceef-43c2-a811-c50a5d83de93\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" Mar 18 17:34:01.888763 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.888739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttxtr\" (UniqueName: \"kubernetes.io/projected/96b0921b-ceef-43c2-a811-c50a5d83de93-kube-api-access-ttxtr\") pod \"isvc-xgboost-predictor-7c946d496c-vs4lz\" (UID: \"96b0921b-ceef-43c2-a811-c50a5d83de93\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" Mar 18 17:34:01.888858 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.888788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96b0921b-ceef-43c2-a811-c50a5d83de93-kserve-provision-location\") pod \"isvc-xgboost-predictor-7c946d496c-vs4lz\" (UID: \"96b0921b-ceef-43c2-a811-c50a5d83de93\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" Mar 18 17:34:01.889168 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.889152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96b0921b-ceef-43c2-a811-c50a5d83de93-kserve-provision-location\") pod \"isvc-xgboost-predictor-7c946d496c-vs4lz\" (UID: \"96b0921b-ceef-43c2-a811-c50a5d83de93\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" Mar 18 17:34:01.896848 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.896832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttxtr\" (UniqueName: \"kubernetes.io/projected/96b0921b-ceef-43c2-a811-c50a5d83de93-kube-api-access-ttxtr\") pod \"isvc-xgboost-predictor-7c946d496c-vs4lz\" (UID: \"96b0921b-ceef-43c2-a811-c50a5d83de93\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" Mar 18 17:34:01.964969 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:01.964944 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" Mar 18 17:34:02.089035 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:02.088982 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz"] Mar 18 17:34:02.091470 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:34:02.091434 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96b0921b_ceef_43c2_a811_c50a5d83de93.slice/crio-09f0167ac5f0572c4d9c26bff019a11b2c697d1ef77275b11a73f3363c2f9fbf WatchSource:0}: Error finding container 09f0167ac5f0572c4d9c26bff019a11b2c697d1ef77275b11a73f3363c2f9fbf: Status 404 returned error can't find the container with id 09f0167ac5f0572c4d9c26bff019a11b2c697d1ef77275b11a73f3363c2f9fbf Mar 18 17:34:02.376895 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:02.376806 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" event={"ID":"96b0921b-ceef-43c2-a811-c50a5d83de93","Type":"ContainerStarted","Data":"425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d"} Mar 18 17:34:02.376895 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:02.376839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" event={"ID":"96b0921b-ceef-43c2-a811-c50a5d83de93","Type":"ContainerStarted","Data":"09f0167ac5f0572c4d9c26bff019a11b2c697d1ef77275b11a73f3363c2f9fbf"} Mar 18 17:34:04.523732 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:04.523711 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:34:04.610990 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:04.610960 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bp9x\" (UniqueName: \"kubernetes.io/projected/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kube-api-access-6bp9x\") pod \"7091abb3-5a2c-46a8-ae39-577cb8493fb3\" (UID: \"7091abb3-5a2c-46a8-ae39-577cb8493fb3\") " Mar 18 17:34:04.611121 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:04.611078 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kserve-provision-location\") pod \"7091abb3-5a2c-46a8-ae39-577cb8493fb3\" (UID: \"7091abb3-5a2c-46a8-ae39-577cb8493fb3\") " Mar 18 17:34:04.611444 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:04.611422 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7091abb3-5a2c-46a8-ae39-577cb8493fb3" (UID: "7091abb3-5a2c-46a8-ae39-577cb8493fb3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:34:04.613114 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:04.613092 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kube-api-access-6bp9x" (OuterVolumeSpecName: "kube-api-access-6bp9x") pod "7091abb3-5a2c-46a8-ae39-577cb8493fb3" (UID: "7091abb3-5a2c-46a8-ae39-577cb8493fb3"). InnerVolumeSpecName "kube-api-access-6bp9x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:34:04.712492 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:04.712470 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:34:04.712492 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:04.712490 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6bp9x\" (UniqueName: \"kubernetes.io/projected/7091abb3-5a2c-46a8-ae39-577cb8493fb3-kube-api-access-6bp9x\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:34:05.287278 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:34:05.287250 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:34:05.388851 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.388825 2575 generic.go:358] "Generic (PLEG): container finished" podID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" containerID="5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e" exitCode=0 Mar 18 17:34:05.388968 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.388888 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" Mar 18 17:34:05.388968 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.388894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" event={"ID":"7091abb3-5a2c-46a8-ae39-577cb8493fb3","Type":"ContainerDied","Data":"5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e"} Mar 18 17:34:05.388968 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.388922 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj" event={"ID":"7091abb3-5a2c-46a8-ae39-577cb8493fb3","Type":"ContainerDied","Data":"2c277036e7619348c6cf2b5a27bba6d2415d22ddfa4b8c5410abaf8816d9277b"} Mar 18 17:34:05.388968 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.388937 2575 scope.go:117] "RemoveContainer" containerID="5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e" Mar 18 17:34:05.398302 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.398282 2575 scope.go:117] "RemoveContainer" containerID="5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4" Mar 18 17:34:05.409178 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.409162 2575 scope.go:117] "RemoveContainer" containerID="5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e" Mar 18 17:34:05.409536 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:34:05.409505 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e\": container with ID starting with 5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e not found: ID does not exist" containerID="5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e" Mar 18 17:34:05.409629 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.409546 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e"} err="failed to get container status \"5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e\": rpc error: code = NotFound desc = could not find container \"5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e\": container with ID starting with 5d3ecf0f91c38105121809a2846162a848000f0490fd7a9423d8a7d89a44367e not found: ID does not exist" Mar 18 17:34:05.409629 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.409569 2575 scope.go:117] "RemoveContainer" containerID="5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4" Mar 18 17:34:05.409874 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:34:05.409855 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4\": container with ID starting with 5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4 not found: ID does not exist" containerID="5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4" Mar 18 17:34:05.409950 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.409880 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4"} err="failed to get container status \"5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4\": rpc error: code = NotFound desc = could not 
find container \"5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4\": container with ID starting with 5968ccca4f32f3a7464bbe0650c9e3b7e46145c648adf36904a4e4ee123b43e4 not found: ID does not exist" Mar 18 17:34:05.410615 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.410594 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj"] Mar 18 17:34:05.415522 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:05.415501 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-54cbc8ffbb-28sjj"] Mar 18 17:34:06.290293 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:06.290228 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" path="/var/lib/kubelet/pods/7091abb3-5a2c-46a8-ae39-577cb8493fb3/volumes" Mar 18 17:34:06.392593 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:06.392567 2575 generic.go:358] "Generic (PLEG): container finished" podID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerID="425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d" exitCode=0 Mar 18 17:34:06.392727 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:06.392645 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" event={"ID":"96b0921b-ceef-43c2-a811-c50a5d83de93","Type":"ContainerDied","Data":"425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d"} Mar 18 17:34:20.287123 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:34:20.287081 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:34:26.468648 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:26.468617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" event={"ID":"96b0921b-ceef-43c2-a811-c50a5d83de93","Type":"ContainerStarted","Data":"5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246"} Mar 18 17:34:26.469088 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:26.468913 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" Mar 18 17:34:26.470198 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:26.470173 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Mar 18 17:34:26.485805 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:26.485760 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" podStartSLOduration=5.91391098 
Mar 18 17:34:27.472738 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:27.472701 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Mar 18 17:34:33.286961 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:34:33.286928 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:34:37.473376 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:37.473328 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Mar 18 17:34:42.388278 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:42.388239 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log"
Mar 18 17:34:42.394625 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:42.394602 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log"
Mar 18 17:34:44.287593 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:34:44.287524 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:34:47.472959 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:47.472917 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Mar 18 17:34:57.286970 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:34:57.286938 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:34:57.473000 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:34:57.472963 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Mar 18 17:35:07.473032 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:07.472992 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Mar 18 17:35:10.287158 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:35:10.287124 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:35:17.472713 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:17.472670 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Mar 18 17:35:21.287010 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:35:21.286979 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:35:27.473546 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:27.473513 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz"
Mar 18 17:35:31.401252 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.401222 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz"]
Mar 18 17:35:31.401701 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.401559 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="kserve-container" containerID="cri-o://5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246" gracePeriod=30
Mar 18 17:35:31.549038 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.548999 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"]
Mar 18 17:35:31.549453 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.549436 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" containerName="kserve-container"
Mar 18 17:35:31.549532 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.549454 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" containerName="kserve-container"
Mar 18 17:35:31.549532 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.549468 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" containerName="storage-initializer"
Mar 18 17:35:31.549532 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.549474 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" containerName="storage-initializer"
Mar 18 17:35:31.549647 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.549535 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7091abb3-5a2c-46a8-ae39-577cb8493fb3" containerName="kserve-container"
Mar 18 17:35:31.552590 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.552571 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"
Mar 18 17:35:31.563959 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.563935 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"]
Mar 18 17:35:31.650137 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.650109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60161dc3-803f-4e89-8d5e-672f73527951-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp\" (UID: \"60161dc3-803f-4e89-8d5e-672f73527951\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"
Mar 18 17:35:31.650248 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.650153 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnptn\" (UniqueName: \"kubernetes.io/projected/60161dc3-803f-4e89-8d5e-672f73527951-kube-api-access-lnptn\") pod \"isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp\" (UID: \"60161dc3-803f-4e89-8d5e-672f73527951\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"
Mar 18 17:35:31.751140 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.751067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60161dc3-803f-4e89-8d5e-672f73527951-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp\" (UID: \"60161dc3-803f-4e89-8d5e-672f73527951\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"
Mar 18 17:35:31.751140 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.751104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnptn\" (UniqueName: \"kubernetes.io/projected/60161dc3-803f-4e89-8d5e-672f73527951-kube-api-access-lnptn\") pod \"isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp\" (UID: \"60161dc3-803f-4e89-8d5e-672f73527951\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"
Mar 18 17:35:31.751437 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.751415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60161dc3-803f-4e89-8d5e-672f73527951-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp\" (UID: \"60161dc3-803f-4e89-8d5e-672f73527951\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"
Mar 18 17:35:31.764114 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.764084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnptn\" (UniqueName: \"kubernetes.io/projected/60161dc3-803f-4e89-8d5e-672f73527951-kube-api-access-lnptn\") pod \"isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp\" (UID: \"60161dc3-803f-4e89-8d5e-672f73527951\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"
Mar 18 17:35:31.863003 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.862976 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"
Mar 18 17:35:31.995728 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:31.995705 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"]
Mar 18 17:35:31.998144 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:35:31.998116 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60161dc3_803f_4e89_8d5e_672f73527951.slice/crio-1e198980a6921a211bed852c0790c02f208afc6d10d1909f32ab92ed5e3f4074 WatchSource:0}: Error finding container 1e198980a6921a211bed852c0790c02f208afc6d10d1909f32ab92ed5e3f4074: Status 404 returned error can't find the container with id 1e198980a6921a211bed852c0790c02f208afc6d10d1909f32ab92ed5e3f4074
Mar 18 17:35:32.689309 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:32.689272 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" event={"ID":"60161dc3-803f-4e89-8d5e-672f73527951","Type":"ContainerStarted","Data":"2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc"}
Mar 18 17:35:32.689309 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:32.689311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" event={"ID":"60161dc3-803f-4e89-8d5e-672f73527951","Type":"ContainerStarted","Data":"1e198980a6921a211bed852c0790c02f208afc6d10d1909f32ab92ed5e3f4074"}
Mar 18 17:35:34.287086 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:35:34.287046 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d"
Mar 18 17:35:34.554576 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.554555 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz"
Mar 18 17:35:34.675981 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.675954 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96b0921b-ceef-43c2-a811-c50a5d83de93-kserve-provision-location\") pod \"96b0921b-ceef-43c2-a811-c50a5d83de93\" (UID: \"96b0921b-ceef-43c2-a811-c50a5d83de93\") "
Mar 18 17:35:34.676115 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.675987 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttxtr\" (UniqueName: \"kubernetes.io/projected/96b0921b-ceef-43c2-a811-c50a5d83de93-kube-api-access-ttxtr\") pod \"96b0921b-ceef-43c2-a811-c50a5d83de93\" (UID: \"96b0921b-ceef-43c2-a811-c50a5d83de93\") "
Mar 18 17:35:34.676298 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.676275 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96b0921b-ceef-43c2-a811-c50a5d83de93-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "96b0921b-ceef-43c2-a811-c50a5d83de93" (UID: "96b0921b-ceef-43c2-a811-c50a5d83de93"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 17:35:34.678148 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.678127 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b0921b-ceef-43c2-a811-c50a5d83de93-kube-api-access-ttxtr" (OuterVolumeSpecName: "kube-api-access-ttxtr") pod "96b0921b-ceef-43c2-a811-c50a5d83de93" (UID: "96b0921b-ceef-43c2-a811-c50a5d83de93"). InnerVolumeSpecName "kube-api-access-ttxtr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:35:34.697197 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.697170 2575 generic.go:358] "Generic (PLEG): container finished" podID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerID="5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246" exitCode=0
Mar 18 17:35:34.697287 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.697244 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" event={"ID":"96b0921b-ceef-43c2-a811-c50a5d83de93","Type":"ContainerDied","Data":"5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246"}
Mar 18 17:35:34.697287 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.697251 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz"
Mar 18 17:35:34.697287 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.697270 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz" event={"ID":"96b0921b-ceef-43c2-a811-c50a5d83de93","Type":"ContainerDied","Data":"09f0167ac5f0572c4d9c26bff019a11b2c697d1ef77275b11a73f3363c2f9fbf"}
Mar 18 17:35:34.697287 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.697285 2575 scope.go:117] "RemoveContainer" containerID="5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246"
Mar 18 17:35:34.705622 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.705604 2575 scope.go:117] "RemoveContainer" containerID="425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d"
Mar 18 17:35:34.715036 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.715015 2575 scope.go:117] "RemoveContainer" containerID="5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246"
Mar 18 17:35:34.715320 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:35:34.715292 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246\": container with ID starting with 5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246 not found: ID does not exist" containerID="5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246"
Mar 18 17:35:34.715409 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.715334 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246"} err="failed to get container status \"5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246\": rpc error: code = NotFound desc = could not find container \"5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246\": container with ID starting with 5ff5db0fcb18b45de4edb93f59e80cff9a17aa4123def1d34fcb6cfdd0df3246 not found: ID does not exist"
Mar 18 17:35:34.715409 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.715385 2575 scope.go:117] "RemoveContainer" containerID="425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d"
Mar 18 17:35:34.715714 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:35:34.715686 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d\": container with ID starting with 425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d not found: ID does not exist" containerID="425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d"
Mar 18 17:35:34.715829 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.715722 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d"} err="failed to get container status \"425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d\": rpc error: code = NotFound desc = could not find container \"425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d\": container with ID starting with 425d0c645b514689ab19712fd9b16cf0ec6c66fa88af3870a7c8a2210206c50d not found: ID does not exist"
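
The RemoveContainer / NotFound pairs above look alarming but are routine: the kubelet removes a container, a second cleanup path then asks the runtime for its status, and CRI-O answers NotFound because the removal already succeeded. Treating NotFound as "already done" makes deletion idempotent. An illustrative sketch of that pattern (not kubelet code; all names here are invented for the example):

    class NotFoundError(Exception):
        """The runtime's 'no such container' answer."""

    def remove_container(runtime, container_id):
        # Two cleanup paths (PLEG relist and API-driven deletion) can race
        # to remove the same container; losing the race is success, not failure.
        try:
            runtime.remove(container_id)
        except NotFoundError:
            pass  # already gone -- the desired state is reached either way
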
pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz"] Mar 18 17:35:34.719526 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.719505 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-7c946d496c-vs4lz"] Mar 18 17:35:34.777198 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.777177 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96b0921b-ceef-43c2-a811-c50a5d83de93-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:35:34.777198 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:34.777196 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ttxtr\" (UniqueName: \"kubernetes.io/projected/96b0921b-ceef-43c2-a811-c50a5d83de93-kube-api-access-ttxtr\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:35:36.290211 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:36.290131 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" path="/var/lib/kubelet/pods/96b0921b-ceef-43c2-a811-c50a5d83de93/volumes" Mar 18 17:35:36.707514 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:36.707484 2575 generic.go:358] "Generic (PLEG): container finished" podID="60161dc3-803f-4e89-8d5e-672f73527951" containerID="2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc" exitCode=0 Mar 18 17:35:36.707687 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:36.707546 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" event={"ID":"60161dc3-803f-4e89-8d5e-672f73527951","Type":"ContainerDied","Data":"2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc"} Mar 18 17:35:37.712221 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:37.712188 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" event={"ID":"60161dc3-803f-4e89-8d5e-672f73527951","Type":"ContainerStarted","Data":"a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5"} Mar 18 17:35:37.712648 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:37.712400 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" Mar 18 17:35:37.727864 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:35:37.727814 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" podStartSLOduration=6.727802006 podStartE2EDuration="6.727802006s" podCreationTimestamp="2026-03-18 17:35:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:35:37.726638293 +0000 UTC m=+3056.037891211" watchObservedRunningTime="2026-03-18 17:35:37.727802006 +0000 UTC m=+3056.039054926" Mar 18 17:35:49.287625 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:35:49.287591 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: 
unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:36:04.287385 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:04.287263 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:36:04.287633 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:36:04.287479 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:36:08.720613 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:08.720580 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" Mar 18 17:36:11.679460 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.679430 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"] Mar 18 17:36:11.679856 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.679664 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" podUID="60161dc3-803f-4e89-8d5e-672f73527951" containerName="kserve-container" containerID="cri-o://a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5" gracePeriod=30 Mar 18 17:36:11.870429 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.870379 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk"] Mar 18 17:36:11.870705 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.870693 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="kserve-container" Mar 18 17:36:11.870752 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.870706 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="kserve-container" Mar 18 17:36:11.870752 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.870715 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="storage-initializer" Mar 18 17:36:11.870752 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.870721 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" containerName="storage-initializer" Mar 18 17:36:11.870848 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.870780 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="96b0921b-ceef-43c2-a811-c50a5d83de93" 
containerName="kserve-container" Mar 18 17:36:11.873851 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.873827 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:11.883561 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.883535 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk"] Mar 18 17:36:11.943441 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.943374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-546cb4c58-8h7rk\" (UID: \"5b8ee605-c3ae-49e6-a2ab-37107648efe1\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:11.943441 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:11.943430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x5tz\" (UniqueName: \"kubernetes.io/projected/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kube-api-access-7x5tz\") pod \"xgboost-v2-mlserver-predictor-546cb4c58-8h7rk\" (UID: \"5b8ee605-c3ae-49e6-a2ab-37107648efe1\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:12.044398 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:12.044353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-546cb4c58-8h7rk\" (UID: \"5b8ee605-c3ae-49e6-a2ab-37107648efe1\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:12.044511 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:12.044409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x5tz\" (UniqueName: \"kubernetes.io/projected/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kube-api-access-7x5tz\") pod \"xgboost-v2-mlserver-predictor-546cb4c58-8h7rk\" (UID: \"5b8ee605-c3ae-49e6-a2ab-37107648efe1\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:12.044699 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:12.044682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-546cb4c58-8h7rk\" (UID: \"5b8ee605-c3ae-49e6-a2ab-37107648efe1\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:12.052811 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:12.052789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x5tz\" (UniqueName: \"kubernetes.io/projected/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kube-api-access-7x5tz\") pod \"xgboost-v2-mlserver-predictor-546cb4c58-8h7rk\" (UID: \"5b8ee605-c3ae-49e6-a2ab-37107648efe1\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:12.184850 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:12.184829 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:12.309645 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:12.309620 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk"] Mar 18 17:36:12.311475 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:36:12.311450 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b8ee605_c3ae_49e6_a2ab_37107648efe1.slice/crio-2ff26d7d8f28ceb884dd98e0eb5a6be4e65bf8c2fd7152c4e799b4aa71385a55 WatchSource:0}: Error finding container 2ff26d7d8f28ceb884dd98e0eb5a6be4e65bf8c2fd7152c4e799b4aa71385a55: Status 404 returned error can't find the container with id 2ff26d7d8f28ceb884dd98e0eb5a6be4e65bf8c2fd7152c4e799b4aa71385a55 Mar 18 17:36:12.827811 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:12.827780 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" event={"ID":"5b8ee605-c3ae-49e6-a2ab-37107648efe1","Type":"ContainerStarted","Data":"da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac"} Mar 18 17:36:12.827811 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:12.827813 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" event={"ID":"5b8ee605-c3ae-49e6-a2ab-37107648efe1","Type":"ContainerStarted","Data":"2ff26d7d8f28ceb884dd98e0eb5a6be4e65bf8c2fd7152c4e799b4aa71385a55"} Mar 18 17:36:16.147748 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.147728 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" Mar 18 17:36:16.175904 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.175876 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnptn\" (UniqueName: \"kubernetes.io/projected/60161dc3-803f-4e89-8d5e-672f73527951-kube-api-access-lnptn\") pod \"60161dc3-803f-4e89-8d5e-672f73527951\" (UID: \"60161dc3-803f-4e89-8d5e-672f73527951\") " Mar 18 17:36:16.176033 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.175949 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60161dc3-803f-4e89-8d5e-672f73527951-kserve-provision-location\") pod \"60161dc3-803f-4e89-8d5e-672f73527951\" (UID: \"60161dc3-803f-4e89-8d5e-672f73527951\") " Mar 18 17:36:16.176270 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.176245 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60161dc3-803f-4e89-8d5e-672f73527951-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "60161dc3-803f-4e89-8d5e-672f73527951" (UID: "60161dc3-803f-4e89-8d5e-672f73527951"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:36:16.178027 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.178006 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60161dc3-803f-4e89-8d5e-672f73527951-kube-api-access-lnptn" (OuterVolumeSpecName: "kube-api-access-lnptn") pod "60161dc3-803f-4e89-8d5e-672f73527951" (UID: "60161dc3-803f-4e89-8d5e-672f73527951"). InnerVolumeSpecName "kube-api-access-lnptn". 
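
Each pod's volumes move through the same reconciler lifecycle visible above: VerifyControllerAttachedVolume and MountVolume.SetUp at creation, then UnmountVolume started, UnmountVolume.TearDown succeeded, and "Volume detached" at deletion. A small stdlib sketch that pulls one pod's volume events, in order, out of an excerpt like this (the pod UID is taken from these records; the file name is hypothetical):

    import re

    STEPS = ("VerifyControllerAttachedVolume", "MountVolume started",
             "MountVolume.SetUp succeeded", "UnmountVolume started",
             "UnmountVolume.TearDown succeeded", "Volume detached")

    def volume_events(lines, pod_uid):
        for line in lines:
            if pod_uid not in line:
                continue
            for step in STEPS:
                if step in line:
                    # The volume name follows 'volume', quoted (sometimes with
                    # escaped quotes), in every one of these record formats.
                    m = re.search(r'volume \\?"([^"\\]+)', line)
                    yield step, m.group(1) if m else "?"
                    break

    with open("kubelet-excerpt.log") as f:  # hypothetical file name
        for step, name in volume_events(f, "60161dc3-803f-4e89-8d5e-672f73527951"):
            print(f"{step:35} {name}")
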
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:36:16.276542 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.276480 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lnptn\" (UniqueName: \"kubernetes.io/projected/60161dc3-803f-4e89-8d5e-672f73527951-kube-api-access-lnptn\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:36:16.276542 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.276507 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/60161dc3-803f-4e89-8d5e-672f73527951-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:36:16.841201 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.841173 2575 generic.go:358] "Generic (PLEG): container finished" podID="60161dc3-803f-4e89-8d5e-672f73527951" containerID="a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5" exitCode=0 Mar 18 17:36:16.841389 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.841250 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" Mar 18 17:36:16.841389 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.841269 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" event={"ID":"60161dc3-803f-4e89-8d5e-672f73527951","Type":"ContainerDied","Data":"a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5"} Mar 18 17:36:16.841389 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.841312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp" event={"ID":"60161dc3-803f-4e89-8d5e-672f73527951","Type":"ContainerDied","Data":"1e198980a6921a211bed852c0790c02f208afc6d10d1909f32ab92ed5e3f4074"} Mar 18 17:36:16.841389 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.841333 2575 scope.go:117] "RemoveContainer" containerID="a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5" Mar 18 17:36:16.842801 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.842772 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b8ee605-c3ae-49e6-a2ab-37107648efe1" containerID="da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac" exitCode=0 Mar 18 17:36:16.842914 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.842873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" event={"ID":"5b8ee605-c3ae-49e6-a2ab-37107648efe1","Type":"ContainerDied","Data":"da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac"} Mar 18 17:36:16.849902 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.849889 2575 scope.go:117] "RemoveContainer" containerID="2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc" Mar 18 17:36:16.859278 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.859262 2575 scope.go:117] "RemoveContainer" containerID="a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5" Mar 18 17:36:16.859534 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:36:16.859513 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5\": container with ID starting with a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5 not 
found: ID does not exist" containerID="a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5" Mar 18 17:36:16.859624 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.859544 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5"} err="failed to get container status \"a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5\": rpc error: code = NotFound desc = could not find container \"a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5\": container with ID starting with a7aea4160f31c903d1892bc23833305cb225b48ccfcf4916fdf4bf1a1db56ea5 not found: ID does not exist" Mar 18 17:36:16.859624 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.859565 2575 scope.go:117] "RemoveContainer" containerID="2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc" Mar 18 17:36:16.859810 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:36:16.859793 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc\": container with ID starting with 2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc not found: ID does not exist" containerID="2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc" Mar 18 17:36:16.859857 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.859817 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc"} err="failed to get container status \"2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc\": rpc error: code = NotFound desc = could not find container \"2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc\": container with ID starting with 2190f3ce1240f7bd157303a84dc1bb8ada07ee45a5a0b3c7c1c10dc2953be9dc not found: ID does not exist" Mar 18 17:36:16.872168 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.872147 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"] Mar 18 17:36:16.887339 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:16.887321 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-864f998569-kpbmp"] Mar 18 17:36:17.287439 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:36:17.287409 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:36:17.847088 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:17.847056 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" 
event={"ID":"5b8ee605-c3ae-49e6-a2ab-37107648efe1","Type":"ContainerStarted","Data":"a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267"} Mar 18 17:36:17.847291 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:17.847275 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:17.864432 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:17.864383 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" podStartSLOduration=6.864351487 podStartE2EDuration="6.864351487s" podCreationTimestamp="2026-03-18 17:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:36:17.864151939 +0000 UTC m=+3096.175404860" watchObservedRunningTime="2026-03-18 17:36:17.864351487 +0000 UTC m=+3096.175604408" Mar 18 17:36:18.290163 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:18.290121 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60161dc3-803f-4e89-8d5e-672f73527951" path="/var/lib/kubelet/pods/60161dc3-803f-4e89-8d5e-672f73527951/volumes" Mar 18 17:36:29.287187 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:36:29.287143 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:36:40.286811 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:36:40.286736 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:36:48.856279 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:48.856251 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:51.286822 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:36:51.286791 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing 
source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:36:51.769233 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:51.769204 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk"] Mar 18 17:36:51.769608 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:51.769573 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" podUID="5b8ee605-c3ae-49e6-a2ab-37107648efe1" containerName="kserve-container" containerID="cri-o://a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267" gracePeriod=30 Mar 18 17:36:51.944005 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:51.943970 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp"] Mar 18 17:36:51.944379 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:51.944349 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60161dc3-803f-4e89-8d5e-672f73527951" containerName="storage-initializer" Mar 18 17:36:51.944429 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:51.944383 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="60161dc3-803f-4e89-8d5e-672f73527951" containerName="storage-initializer" Mar 18 17:36:51.944429 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:51.944393 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60161dc3-803f-4e89-8d5e-672f73527951" containerName="kserve-container" Mar 18 17:36:51.944429 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:51.944398 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="60161dc3-803f-4e89-8d5e-672f73527951" containerName="kserve-container" Mar 18 17:36:51.944528 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:51.944466 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="60161dc3-803f-4e89-8d5e-672f73527951" containerName="kserve-container" Mar 18 17:36:51.947795 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:51.947776 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:36:51.956111 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:51.956085 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp"] Mar 18 17:36:52.031812 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:52.031742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6c225cd-65b8-4b10-b5fa-710d49755921-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-98d86b655-krfrp\" (UID: \"f6c225cd-65b8-4b10-b5fa-710d49755921\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:36:52.031812 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:52.031799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvd5z\" (UniqueName: \"kubernetes.io/projected/f6c225cd-65b8-4b10-b5fa-710d49755921-kube-api-access-zvd5z\") pod \"isvc-xgboost-runtime-predictor-98d86b655-krfrp\" (UID: \"f6c225cd-65b8-4b10-b5fa-710d49755921\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:36:52.132448 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:52.132408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6c225cd-65b8-4b10-b5fa-710d49755921-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-98d86b655-krfrp\" (UID: \"f6c225cd-65b8-4b10-b5fa-710d49755921\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:36:52.132566 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:52.132483 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvd5z\" (UniqueName: \"kubernetes.io/projected/f6c225cd-65b8-4b10-b5fa-710d49755921-kube-api-access-zvd5z\") pod \"isvc-xgboost-runtime-predictor-98d86b655-krfrp\" (UID: \"f6c225cd-65b8-4b10-b5fa-710d49755921\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:36:52.132745 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:52.132722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6c225cd-65b8-4b10-b5fa-710d49755921-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-98d86b655-krfrp\" (UID: \"f6c225cd-65b8-4b10-b5fa-710d49755921\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:36:52.140374 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:52.140342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvd5z\" (UniqueName: \"kubernetes.io/projected/f6c225cd-65b8-4b10-b5fa-710d49755921-kube-api-access-zvd5z\") pod \"isvc-xgboost-runtime-predictor-98d86b655-krfrp\" (UID: \"f6c225cd-65b8-4b10-b5fa-710d49755921\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:36:52.258037 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:52.258011 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:36:52.381854 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:52.381822 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp"] Mar 18 17:36:52.385013 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:36:52.384987 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c225cd_65b8_4b10_b5fa_710d49755921.slice/crio-42a4a2a65434afd13af2e2f91e68e95ce0f3ffac3273feef2b096cc167aad2e5 WatchSource:0}: Error finding container 42a4a2a65434afd13af2e2f91e68e95ce0f3ffac3273feef2b096cc167aad2e5: Status 404 returned error can't find the container with id 42a4a2a65434afd13af2e2f91e68e95ce0f3ffac3273feef2b096cc167aad2e5 Mar 18 17:36:52.965499 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:52.965463 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" event={"ID":"f6c225cd-65b8-4b10-b5fa-710d49755921","Type":"ContainerStarted","Data":"32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2"} Mar 18 17:36:52.965685 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:52.965503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" event={"ID":"f6c225cd-65b8-4b10-b5fa-710d49755921","Type":"ContainerStarted","Data":"42a4a2a65434afd13af2e2f91e68e95ce0f3ffac3273feef2b096cc167aad2e5"} Mar 18 17:36:56.316501 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.316479 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:56.462136 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.462114 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x5tz\" (UniqueName: \"kubernetes.io/projected/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kube-api-access-7x5tz\") pod \"5b8ee605-c3ae-49e6-a2ab-37107648efe1\" (UID: \"5b8ee605-c3ae-49e6-a2ab-37107648efe1\") " Mar 18 17:36:56.462262 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.462159 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kserve-provision-location\") pod \"5b8ee605-c3ae-49e6-a2ab-37107648efe1\" (UID: \"5b8ee605-c3ae-49e6-a2ab-37107648efe1\") " Mar 18 17:36:56.462521 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.462497 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b8ee605-c3ae-49e6-a2ab-37107648efe1" (UID: "5b8ee605-c3ae-49e6-a2ab-37107648efe1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:36:56.464093 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.464067 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kube-api-access-7x5tz" (OuterVolumeSpecName: "kube-api-access-7x5tz") pod "5b8ee605-c3ae-49e6-a2ab-37107648efe1" (UID: "5b8ee605-c3ae-49e6-a2ab-37107648efe1"). InnerVolumeSpecName "kube-api-access-7x5tz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:36:56.563466 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.563432 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7x5tz\" (UniqueName: \"kubernetes.io/projected/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kube-api-access-7x5tz\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:36:56.563466 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.563463 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b8ee605-c3ae-49e6-a2ab-37107648efe1-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:36:56.979631 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.979599 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerID="32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2" exitCode=0 Mar 18 17:36:56.979740 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.979668 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" event={"ID":"f6c225cd-65b8-4b10-b5fa-710d49755921","Type":"ContainerDied","Data":"32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2"} Mar 18 17:36:56.981144 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.981122 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b8ee605-c3ae-49e6-a2ab-37107648efe1" containerID="a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267" exitCode=0 Mar 18 17:36:56.981230 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.981177 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" Mar 18 17:36:56.981274 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.981178 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" event={"ID":"5b8ee605-c3ae-49e6-a2ab-37107648efe1","Type":"ContainerDied","Data":"a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267"} Mar 18 17:36:56.981311 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.981292 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk" event={"ID":"5b8ee605-c3ae-49e6-a2ab-37107648efe1","Type":"ContainerDied","Data":"2ff26d7d8f28ceb884dd98e0eb5a6be4e65bf8c2fd7152c4e799b4aa71385a55"} Mar 18 17:36:56.981347 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.981317 2575 scope.go:117] "RemoveContainer" containerID="a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267" Mar 18 17:36:56.989596 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:56.989582 2575 scope.go:117] "RemoveContainer" containerID="da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac" Mar 18 17:36:57.000867 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:57.000850 2575 scope.go:117] "RemoveContainer" containerID="a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267" Mar 18 17:36:57.001139 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:36:57.001119 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267\": container with ID starting with a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267 not found: ID does not 
exist" containerID="a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267" Mar 18 17:36:57.001230 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:57.001151 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267"} err="failed to get container status \"a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267\": rpc error: code = NotFound desc = could not find container \"a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267\": container with ID starting with a235da3cfd084e1b38573e2fdc8d75652f979c491ae85a729cf8a8df5cf24267 not found: ID does not exist" Mar 18 17:36:57.001230 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:57.001173 2575 scope.go:117] "RemoveContainer" containerID="da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac" Mar 18 17:36:57.001457 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:36:57.001440 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac\": container with ID starting with da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac not found: ID does not exist" containerID="da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac" Mar 18 17:36:57.001530 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:57.001465 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac"} err="failed to get container status \"da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac\": rpc error: code = NotFound desc = could not find container \"da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac\": container with ID starting with da8e729033cfe11e7a1f6d0119e555a849ca19583614a8d290fe73fece5251ac not found: ID does not exist" Mar 18 17:36:57.007157 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:57.007135 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk"] Mar 18 17:36:57.010925 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:57.010907 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-546cb4c58-8h7rk"] Mar 18 17:36:57.986667 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:57.986632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" event={"ID":"f6c225cd-65b8-4b10-b5fa-710d49755921","Type":"ContainerStarted","Data":"3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6"} Mar 18 17:36:57.987025 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:57.987000 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:36:57.988110 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:57.988077 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Mar 18 17:36:58.004863 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:58.004828 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" podStartSLOduration=7.004817052 podStartE2EDuration="7.004817052s" podCreationTimestamp="2026-03-18 17:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:36:58.003015492 +0000 UTC m=+3136.314268412" watchObservedRunningTime="2026-03-18 17:36:58.004817052 +0000 UTC m=+3136.316070028" Mar 18 17:36:58.290322 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:58.290247 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b8ee605-c3ae-49e6-a2ab-37107648efe1" path="/var/lib/kubelet/pods/5b8ee605-c3ae-49e6-a2ab-37107648efe1/volumes" Mar 18 17:36:58.989667 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:36:58.989630 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Mar 18 17:37:04.287055 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:37:04.287025 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:37:08.989623 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:37:08.989585 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Mar 18 17:37:18.286943 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:37:18.286899 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:37:18.990606 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:37:18.990565 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Mar 18 
17:37:28.989879 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:37:28.989839 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Mar 18 17:37:32.289136 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:37:32.289097 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:37:38.990077 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:37:38.990035 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Mar 18 17:37:46.286897 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:37:46.286867 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:37:48.989930 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:37:48.989888 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Mar 18 17:37:58.287458 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:37:58.287419 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not 
authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:37:58.990550 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:37:58.990519 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:38:01.921172 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:01.921138 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp"] Mar 18 17:38:01.921598 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:01.921378 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" containerID="cri-o://3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6" gracePeriod=30 Mar 18 17:38:02.175612 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.175531 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z"] Mar 18 17:38:02.175881 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.175868 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b8ee605-c3ae-49e6-a2ab-37107648efe1" containerName="storage-initializer" Mar 18 17:38:02.175929 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.175882 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8ee605-c3ae-49e6-a2ab-37107648efe1" containerName="storage-initializer" Mar 18 17:38:02.175929 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.175901 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b8ee605-c3ae-49e6-a2ab-37107648efe1" containerName="kserve-container" Mar 18 17:38:02.175929 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.175907 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8ee605-c3ae-49e6-a2ab-37107648efe1" containerName="kserve-container" Mar 18 17:38:02.176033 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.175965 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b8ee605-c3ae-49e6-a2ab-37107648efe1" containerName="kserve-container" Mar 18 17:38:02.179078 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.179058 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:02.186572 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.186546 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z"] Mar 18 17:38:02.221129 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.221107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4jwb\" (UniqueName: \"kubernetes.io/projected/744e550d-83f4-4de8-a1a1-60a46c800e4a-kube-api-access-f4jwb\") pod \"isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z\" (UID: \"744e550d-83f4-4de8-a1a1-60a46c800e4a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:02.221243 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.221153 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/744e550d-83f4-4de8-a1a1-60a46c800e4a-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z\" (UID: \"744e550d-83f4-4de8-a1a1-60a46c800e4a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:02.321608 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.321581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4jwb\" (UniqueName: \"kubernetes.io/projected/744e550d-83f4-4de8-a1a1-60a46c800e4a-kube-api-access-f4jwb\") pod \"isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z\" (UID: \"744e550d-83f4-4de8-a1a1-60a46c800e4a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:02.321741 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.321633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/744e550d-83f4-4de8-a1a1-60a46c800e4a-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z\" (UID: \"744e550d-83f4-4de8-a1a1-60a46c800e4a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:02.322024 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.322006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/744e550d-83f4-4de8-a1a1-60a46c800e4a-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z\" (UID: \"744e550d-83f4-4de8-a1a1-60a46c800e4a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:02.329115 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.329095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4jwb\" (UniqueName: \"kubernetes.io/projected/744e550d-83f4-4de8-a1a1-60a46c800e4a-kube-api-access-f4jwb\") pod \"isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z\" (UID: \"744e550d-83f4-4de8-a1a1-60a46c800e4a\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:02.490121 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.490040 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:02.813632 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:02.813600 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z"] Mar 18 17:38:02.816406 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:38:02.816377 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod744e550d_83f4_4de8_a1a1_60a46c800e4a.slice/crio-191c4caae327805610d6f45c40d832c414925a1bb32dbe792895a3ae7152ac89 WatchSource:0}: Error finding container 191c4caae327805610d6f45c40d832c414925a1bb32dbe792895a3ae7152ac89: Status 404 returned error can't find the container with id 191c4caae327805610d6f45c40d832c414925a1bb32dbe792895a3ae7152ac89 Mar 18 17:38:03.197210 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:03.197175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" event={"ID":"744e550d-83f4-4de8-a1a1-60a46c800e4a","Type":"ContainerStarted","Data":"6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8"} Mar 18 17:38:03.197210 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:03.197211 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" event={"ID":"744e550d-83f4-4de8-a1a1-60a46c800e4a","Type":"ContainerStarted","Data":"191c4caae327805610d6f45c40d832c414925a1bb32dbe792895a3ae7152ac89"} Mar 18 17:38:05.162887 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.162864 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:38:05.205912 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.205887 2575 generic.go:358] "Generic (PLEG): container finished" podID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerID="3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6" exitCode=0 Mar 18 17:38:05.206043 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.205971 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" event={"ID":"f6c225cd-65b8-4b10-b5fa-710d49755921","Type":"ContainerDied","Data":"3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6"} Mar 18 17:38:05.206043 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.205993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" event={"ID":"f6c225cd-65b8-4b10-b5fa-710d49755921","Type":"ContainerDied","Data":"42a4a2a65434afd13af2e2f91e68e95ce0f3ffac3273feef2b096cc167aad2e5"} Mar 18 17:38:05.206043 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.206005 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp" Mar 18 17:38:05.206201 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.206008 2575 scope.go:117] "RemoveContainer" containerID="3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6" Mar 18 17:38:05.214145 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.214131 2575 scope.go:117] "RemoveContainer" containerID="32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2" Mar 18 17:38:05.225232 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.225215 2575 scope.go:117] "RemoveContainer" containerID="3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6" Mar 18 17:38:05.225521 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:38:05.225491 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6\": container with ID starting with 3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6 not found: ID does not exist" containerID="3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6" Mar 18 17:38:05.225640 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.225530 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6"} err="failed to get container status \"3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6\": rpc error: code = NotFound desc = could not find container \"3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6\": container with ID starting with 3c139ea6b0e24d1742fd0de64a513e7341e3c7b81ea21026f89cc2159f2b17d6 not found: ID does not exist" Mar 18 17:38:05.225640 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.225547 2575 scope.go:117] "RemoveContainer" containerID="32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2" Mar 18 17:38:05.225834 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:38:05.225818 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2\": container with ID starting with 32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2 not found: ID does not exist" containerID="32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2" Mar 18 17:38:05.225883 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.225841 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2"} err="failed to get container status \"32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2\": rpc error: code = NotFound desc = could not find container \"32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2\": container with ID starting with 32f440e4e4f4b4f08b0dad219a1f33a69fed8766a7e5bcdc634796f2e664abe2 not found: ID does not exist" Mar 18 17:38:05.242918 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.242866 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvd5z\" (UniqueName: \"kubernetes.io/projected/f6c225cd-65b8-4b10-b5fa-710d49755921-kube-api-access-zvd5z\") pod \"f6c225cd-65b8-4b10-b5fa-710d49755921\" (UID: \"f6c225cd-65b8-4b10-b5fa-710d49755921\") " Mar 18 17:38:05.243008 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.242918 2575 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6c225cd-65b8-4b10-b5fa-710d49755921-kserve-provision-location\") pod \"f6c225cd-65b8-4b10-b5fa-710d49755921\" (UID: \"f6c225cd-65b8-4b10-b5fa-710d49755921\") " Mar 18 17:38:05.243193 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.243169 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c225cd-65b8-4b10-b5fa-710d49755921-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f6c225cd-65b8-4b10-b5fa-710d49755921" (UID: "f6c225cd-65b8-4b10-b5fa-710d49755921"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:38:05.244893 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.244872 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c225cd-65b8-4b10-b5fa-710d49755921-kube-api-access-zvd5z" (OuterVolumeSpecName: "kube-api-access-zvd5z") pod "f6c225cd-65b8-4b10-b5fa-710d49755921" (UID: "f6c225cd-65b8-4b10-b5fa-710d49755921"). InnerVolumeSpecName "kube-api-access-zvd5z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:38:05.343490 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.343457 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zvd5z\" (UniqueName: \"kubernetes.io/projected/f6c225cd-65b8-4b10-b5fa-710d49755921-kube-api-access-zvd5z\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:38:05.343490 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.343482 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6c225cd-65b8-4b10-b5fa-710d49755921-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:38:05.525672 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.525626 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp"] Mar 18 17:38:05.528122 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:05.528097 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-98d86b655-krfrp"] Mar 18 17:38:06.290076 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:06.290045 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" path="/var/lib/kubelet/pods/f6c225cd-65b8-4b10-b5fa-710d49755921/volumes" Mar 18 17:38:07.214240 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:07.214206 2575 generic.go:358] "Generic (PLEG): container finished" podID="744e550d-83f4-4de8-a1a1-60a46c800e4a" containerID="6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8" exitCode=0 Mar 18 17:38:07.214454 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:07.214285 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" event={"ID":"744e550d-83f4-4de8-a1a1-60a46c800e4a","Type":"ContainerDied","Data":"6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8"} Mar 18 17:38:08.218726 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:08.218689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" 
event={"ID":"744e550d-83f4-4de8-a1a1-60a46c800e4a","Type":"ContainerStarted","Data":"8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3"} Mar 18 17:38:08.219117 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:08.218910 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:08.236977 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:08.236906 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" podStartSLOduration=6.23689365 podStartE2EDuration="6.23689365s" podCreationTimestamp="2026-03-18 17:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:38:08.236082581 +0000 UTC m=+3206.547335502" watchObservedRunningTime="2026-03-18 17:38:08.23689365 +0000 UTC m=+3206.548146571" Mar 18 17:38:10.287248 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:38:10.287218 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:38:22.288353 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:38:22.288260 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:38:37.591518 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:38:37.591425 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:38:37.591943 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:38:37.591644 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 17:38:37.592843 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:38:37.592813 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:38:39.226620 
ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:39.226587 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:42.173644 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.173607 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z"] Mar 18 17:38:42.173990 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.173881 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" podUID="744e550d-83f4-4de8-a1a1-60a46c800e4a" containerName="kserve-container" containerID="cri-o://8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3" gracePeriod=30 Mar 18 17:38:42.356939 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.356909 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2"] Mar 18 17:38:42.357240 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.357227 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="storage-initializer" Mar 18 17:38:42.357287 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.357242 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="storage-initializer" Mar 18 17:38:42.357287 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.357251 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" Mar 18 17:38:42.357287 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.357257 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" Mar 18 17:38:42.357438 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.357314 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6c225cd-65b8-4b10-b5fa-710d49755921" containerName="kserve-container" Mar 18 17:38:42.360316 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.360302 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:38:42.370935 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.370908 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2"] Mar 18 17:38:42.395922 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.395897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgdw4\" (UniqueName: \"kubernetes.io/projected/ad63772a-9b66-4b62-9b5d-6468583e0479-kube-api-access-tgdw4\") pod \"isvc-xgboost-v2-predictor-86bc7486f5-p87l2\" (UID: \"ad63772a-9b66-4b62-9b5d-6468583e0479\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:38:42.396028 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.395933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad63772a-9b66-4b62-9b5d-6468583e0479-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-86bc7486f5-p87l2\" (UID: \"ad63772a-9b66-4b62-9b5d-6468583e0479\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:38:42.496553 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.496495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgdw4\" (UniqueName: \"kubernetes.io/projected/ad63772a-9b66-4b62-9b5d-6468583e0479-kube-api-access-tgdw4\") pod \"isvc-xgboost-v2-predictor-86bc7486f5-p87l2\" (UID: \"ad63772a-9b66-4b62-9b5d-6468583e0479\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:38:42.496553 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.496527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad63772a-9b66-4b62-9b5d-6468583e0479-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-86bc7486f5-p87l2\" (UID: \"ad63772a-9b66-4b62-9b5d-6468583e0479\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:38:42.496869 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.496852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad63772a-9b66-4b62-9b5d-6468583e0479-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-86bc7486f5-p87l2\" (UID: \"ad63772a-9b66-4b62-9b5d-6468583e0479\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:38:42.505929 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.505903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgdw4\" (UniqueName: \"kubernetes.io/projected/ad63772a-9b66-4b62-9b5d-6468583e0479-kube-api-access-tgdw4\") pod \"isvc-xgboost-v2-predictor-86bc7486f5-p87l2\" (UID: \"ad63772a-9b66-4b62-9b5d-6468583e0479\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:38:42.671217 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.671190 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:38:42.791344 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:42.791322 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2"] Mar 18 17:38:42.793891 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:38:42.793868 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad63772a_9b66_4b62_9b5d_6468583e0479.slice/crio-d873bfb79e7011464cafacc735bfe49bff76d519d927c71f3d61e78fba71c2de WatchSource:0}: Error finding container d873bfb79e7011464cafacc735bfe49bff76d519d927c71f3d61e78fba71c2de: Status 404 returned error can't find the container with id d873bfb79e7011464cafacc735bfe49bff76d519d927c71f3d61e78fba71c2de Mar 18 17:38:43.331532 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:43.331494 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" event={"ID":"ad63772a-9b66-4b62-9b5d-6468583e0479","Type":"ContainerStarted","Data":"ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b"} Mar 18 17:38:43.331532 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:43.331538 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" event={"ID":"ad63772a-9b66-4b62-9b5d-6468583e0479","Type":"ContainerStarted","Data":"d873bfb79e7011464cafacc735bfe49bff76d519d927c71f3d61e78fba71c2de"} Mar 18 17:38:47.345477 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:47.345442 2575 generic.go:358] "Generic (PLEG): container finished" podID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerID="ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b" exitCode=0 Mar 18 17:38:47.345908 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:47.345511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" event={"ID":"ad63772a-9b66-4b62-9b5d-6468583e0479","Type":"ContainerDied","Data":"ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b"} Mar 18 17:38:47.712812 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:47.712790 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:47.838498 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:47.838476 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4jwb\" (UniqueName: \"kubernetes.io/projected/744e550d-83f4-4de8-a1a1-60a46c800e4a-kube-api-access-f4jwb\") pod \"744e550d-83f4-4de8-a1a1-60a46c800e4a\" (UID: \"744e550d-83f4-4de8-a1a1-60a46c800e4a\") " Mar 18 17:38:47.838656 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:47.838541 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/744e550d-83f4-4de8-a1a1-60a46c800e4a-kserve-provision-location\") pod \"744e550d-83f4-4de8-a1a1-60a46c800e4a\" (UID: \"744e550d-83f4-4de8-a1a1-60a46c800e4a\") " Mar 18 17:38:47.838838 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:47.838814 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/744e550d-83f4-4de8-a1a1-60a46c800e4a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "744e550d-83f4-4de8-a1a1-60a46c800e4a" (UID: "744e550d-83f4-4de8-a1a1-60a46c800e4a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:38:47.840658 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:47.840638 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/744e550d-83f4-4de8-a1a1-60a46c800e4a-kube-api-access-f4jwb" (OuterVolumeSpecName: "kube-api-access-f4jwb") pod "744e550d-83f4-4de8-a1a1-60a46c800e4a" (UID: "744e550d-83f4-4de8-a1a1-60a46c800e4a"). InnerVolumeSpecName "kube-api-access-f4jwb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:38:47.939566 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:47.939540 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/744e550d-83f4-4de8-a1a1-60a46c800e4a-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:38:47.939566 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:47.939561 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4jwb\" (UniqueName: \"kubernetes.io/projected/744e550d-83f4-4de8-a1a1-60a46c800e4a-kube-api-access-f4jwb\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:38:48.350741 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.350711 2575 generic.go:358] "Generic (PLEG): container finished" podID="744e550d-83f4-4de8-a1a1-60a46c800e4a" containerID="8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3" exitCode=0 Mar 18 17:38:48.351143 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.350773 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" Mar 18 17:38:48.351143 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.350796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" event={"ID":"744e550d-83f4-4de8-a1a1-60a46c800e4a","Type":"ContainerDied","Data":"8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3"} Mar 18 17:38:48.351143 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.350837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z" event={"ID":"744e550d-83f4-4de8-a1a1-60a46c800e4a","Type":"ContainerDied","Data":"191c4caae327805610d6f45c40d832c414925a1bb32dbe792895a3ae7152ac89"} Mar 18 17:38:48.351143 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.350860 2575 scope.go:117] "RemoveContainer" containerID="8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3" Mar 18 17:38:48.352645 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.352615 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" event={"ID":"ad63772a-9b66-4b62-9b5d-6468583e0479","Type":"ContainerStarted","Data":"f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3"} Mar 18 17:38:48.353100 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.353053 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:38:48.354385 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.354340 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused" Mar 18 17:38:48.359376 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.359326 2575 scope.go:117] "RemoveContainer" containerID="6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8" Mar 18 17:38:48.370865 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.370845 2575 scope.go:117] "RemoveContainer" containerID="8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3" Mar 18 17:38:48.371147 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:38:48.371128 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3\": container with ID starting with 8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3 not found: ID does not exist" containerID="8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3" Mar 18 17:38:48.371222 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.371159 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3"} err="failed to get container status \"8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3\": rpc error: code = NotFound desc = could not find container \"8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3\": container with ID starting with 8b15bf476cc33b24414d0540cdb68ddcd484ae68fd9528e3df88b9bb3d512ab3 not found: ID does not exist" Mar 18 17:38:48.371222 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.371184 2575 
scope.go:117] "RemoveContainer" containerID="6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8" Mar 18 17:38:48.371333 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.371298 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z"] Mar 18 17:38:48.371492 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:38:48.371446 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8\": container with ID starting with 6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8 not found: ID does not exist" containerID="6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8" Mar 18 17:38:48.371492 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.371473 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8"} err="failed to get container status \"6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8\": rpc error: code = NotFound desc = could not find container \"6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8\": container with ID starting with 6d7f5f1499a5e021e67c684d8e537ecffab18ebfb7d2b6fb94c16a2c8f9f1ba8 not found: ID does not exist" Mar 18 17:38:48.375174 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.374976 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-598464cd6d-nm95z"] Mar 18 17:38:48.391218 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:48.391173 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" podStartSLOduration=6.391159189 podStartE2EDuration="6.391159189s" podCreationTimestamp="2026-03-18 17:38:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:38:48.389394334 +0000 UTC m=+3246.700647258" watchObservedRunningTime="2026-03-18 17:38:48.391159189 +0000 UTC m=+3246.702412111" Mar 18 17:38:49.286842 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:38:49.286813 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:38:49.356868 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:49.356834 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused" Mar 18 17:38:50.289971 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:50.289937 2575 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="744e550d-83f4-4de8-a1a1-60a46c800e4a" path="/var/lib/kubelet/pods/744e550d-83f4-4de8-a1a1-60a46c800e4a/volumes" Mar 18 17:38:59.357252 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:38:59.357206 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused" Mar 18 17:39:04.287325 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:39:04.287296 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:39:09.357817 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:09.357741 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused" Mar 18 17:39:15.287167 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:39:15.287136 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:39:19.357666 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:19.357625 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused" Mar 18 17:39:29.286656 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:39:29.286606 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not 
authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:39:29.357428 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:29.357397 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused" Mar 18 17:39:39.357886 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:39.357847 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.71:8080: connect: connection refused" Mar 18 17:39:42.410628 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:42.410601 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:39:42.419069 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:42.419048 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:39:43.286628 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:39:43.286600 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:39:49.357950 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:49.357915 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:39:52.264019 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.263932 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2"] Mar 18 17:39:52.264468 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.264251 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" containerID="cri-o://f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3" gracePeriod=30 Mar 18 17:39:52.549418 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.549329 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw"] Mar 18 17:39:52.549709 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.549693 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="744e550d-83f4-4de8-a1a1-60a46c800e4a" containerName="kserve-container" Mar 18 17:39:52.549755 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.549712 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="744e550d-83f4-4de8-a1a1-60a46c800e4a" containerName="kserve-container" Mar 18 17:39:52.549755 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.549739 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="744e550d-83f4-4de8-a1a1-60a46c800e4a" containerName="storage-initializer" Mar 18 17:39:52.549755 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.549745 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="744e550d-83f4-4de8-a1a1-60a46c800e4a" containerName="storage-initializer" Mar 18 17:39:52.549864 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.549805 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="744e550d-83f4-4de8-a1a1-60a46c800e4a" containerName="kserve-container" Mar 18 17:39:52.552823 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.552805 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:39:52.554865 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.554844 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Mar 18 17:39:52.562936 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.562913 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw"] Mar 18 17:39:52.686197 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.686164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1cdc0d1-3960-4554-864d-a5713c259cf4-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-7cdc46bbf5-844gw\" (UID: \"e1cdc0d1-3960-4554-864d-a5713c259cf4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:39:52.686342 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.686220 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcq7z\" (UniqueName: \"kubernetes.io/projected/e1cdc0d1-3960-4554-864d-a5713c259cf4-kube-api-access-qcq7z\") pod \"isvc-sklearn-s3-predictor-7cdc46bbf5-844gw\" (UID: \"e1cdc0d1-3960-4554-864d-a5713c259cf4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:39:52.787181 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.787151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1cdc0d1-3960-4554-864d-a5713c259cf4-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-7cdc46bbf5-844gw\" (UID: \"e1cdc0d1-3960-4554-864d-a5713c259cf4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:39:52.787307 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.787208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcq7z\" (UniqueName: \"kubernetes.io/projected/e1cdc0d1-3960-4554-864d-a5713c259cf4-kube-api-access-qcq7z\") pod \"isvc-sklearn-s3-predictor-7cdc46bbf5-844gw\" (UID: \"e1cdc0d1-3960-4554-864d-a5713c259cf4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:39:52.787520 ip-10-0-139-49 
kubenswrapper[2575]: I0318 17:39:52.787493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1cdc0d1-3960-4554-864d-a5713c259cf4-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-7cdc46bbf5-844gw\" (UID: \"e1cdc0d1-3960-4554-864d-a5713c259cf4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:39:52.796476 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.796456 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcq7z\" (UniqueName: \"kubernetes.io/projected/e1cdc0d1-3960-4554-864d-a5713c259cf4-kube-api-access-qcq7z\") pod \"isvc-sklearn-s3-predictor-7cdc46bbf5-844gw\" (UID: \"e1cdc0d1-3960-4554-864d-a5713c259cf4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:39:52.864083 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.864029 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:39:52.983575 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:52.983500 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw"] Mar 18 17:39:52.986322 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:39:52.986292 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1cdc0d1_3960_4554_864d_a5713c259cf4.slice/crio-3bab399c66084221bb2a2ec17f1a092cfdee0e109a78b96c915fa3218a201d0c WatchSource:0}: Error finding container 3bab399c66084221bb2a2ec17f1a092cfdee0e109a78b96c915fa3218a201d0c: Status 404 returned error can't find the container with id 3bab399c66084221bb2a2ec17f1a092cfdee0e109a78b96c915fa3218a201d0c Mar 18 17:39:53.564166 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:53.564128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" event={"ID":"e1cdc0d1-3960-4554-864d-a5713c259cf4","Type":"ContainerStarted","Data":"6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb"} Mar 18 17:39:53.564166 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:53.564170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" event={"ID":"e1cdc0d1-3960-4554-864d-a5713c259cf4","Type":"ContainerStarted","Data":"3bab399c66084221bb2a2ec17f1a092cfdee0e109a78b96c915fa3218a201d0c"} Mar 18 17:39:54.568849 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:54.568813 2575 generic.go:358] "Generic (PLEG): container finished" podID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerID="6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb" exitCode=0 Mar 18 17:39:54.569230 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:54.568893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" event={"ID":"e1cdc0d1-3960-4554-864d-a5713c259cf4","Type":"ContainerDied","Data":"6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb"} Mar 18 17:39:55.403488 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.403463 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:39:55.411961 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.411943 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad63772a-9b66-4b62-9b5d-6468583e0479-kserve-provision-location\") pod \"ad63772a-9b66-4b62-9b5d-6468583e0479\" (UID: \"ad63772a-9b66-4b62-9b5d-6468583e0479\") " Mar 18 17:39:55.412035 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.412004 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgdw4\" (UniqueName: \"kubernetes.io/projected/ad63772a-9b66-4b62-9b5d-6468583e0479-kube-api-access-tgdw4\") pod \"ad63772a-9b66-4b62-9b5d-6468583e0479\" (UID: \"ad63772a-9b66-4b62-9b5d-6468583e0479\") " Mar 18 17:39:55.412219 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.412197 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad63772a-9b66-4b62-9b5d-6468583e0479-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad63772a-9b66-4b62-9b5d-6468583e0479" (UID: "ad63772a-9b66-4b62-9b5d-6468583e0479"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:39:55.414150 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.414130 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad63772a-9b66-4b62-9b5d-6468583e0479-kube-api-access-tgdw4" (OuterVolumeSpecName: "kube-api-access-tgdw4") pod "ad63772a-9b66-4b62-9b5d-6468583e0479" (UID: "ad63772a-9b66-4b62-9b5d-6468583e0479"). InnerVolumeSpecName "kube-api-access-tgdw4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:39:55.512616 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.512550 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad63772a-9b66-4b62-9b5d-6468583e0479-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:39:55.512616 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.512574 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tgdw4\" (UniqueName: \"kubernetes.io/projected/ad63772a-9b66-4b62-9b5d-6468583e0479-kube-api-access-tgdw4\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:39:55.573384 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.573339 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" event={"ID":"e1cdc0d1-3960-4554-864d-a5713c259cf4","Type":"ContainerStarted","Data":"eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a"} Mar 18 17:39:55.573765 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.573621 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:39:55.574650 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.574619 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.72:8080: connect: connection refused" Mar 18 17:39:55.574896 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.574875 2575 generic.go:358] "Generic (PLEG): container finished" podID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerID="f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3" exitCode=0 Mar 18 17:39:55.574966 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.574915 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" event={"ID":"ad63772a-9b66-4b62-9b5d-6468583e0479","Type":"ContainerDied","Data":"f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3"} Mar 18 17:39:55.574966 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.574938 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" Mar 18 17:39:55.574966 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.574950 2575 scope.go:117] "RemoveContainer" containerID="f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3" Mar 18 17:39:55.575087 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.574938 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2" event={"ID":"ad63772a-9b66-4b62-9b5d-6468583e0479","Type":"ContainerDied","Data":"d873bfb79e7011464cafacc735bfe49bff76d519d927c71f3d61e78fba71c2de"} Mar 18 17:39:55.583728 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.583706 2575 scope.go:117] "RemoveContainer" containerID="ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b" Mar 18 17:39:55.593024 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.592933 2575 scope.go:117] "RemoveContainer" containerID="f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3" Mar 18 17:39:55.593262 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:39:55.593235 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3\": container with ID starting with f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3 not found: ID does not exist" containerID="f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3" Mar 18 17:39:55.593385 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.593269 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3"} err="failed to get container status \"f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3\": rpc error: code = NotFound desc = could not find container \"f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3\": container with ID starting with f0ce525edfad883495bbeca00d06b2ce813315ba970dd1cd27bc3db83580d0f3 not found: ID does not exist" Mar 18 17:39:55.593385 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.593288 2575 scope.go:117] "RemoveContainer" containerID="ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b" Mar 18 17:39:55.593586 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:39:55.593564 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b\": container with ID starting with ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b not found: ID does not exist" containerID="ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b" Mar 18 17:39:55.593636 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.593595 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b"} err="failed to get container status \"ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b\": rpc error: code = NotFound desc = could not find container \"ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b\": container with ID starting with ad37f4968ba1760c48f002a61ab6df4bbef0fd74ad5f9ebda27990bc5b689e7b not found: ID does not exist" Mar 18 17:39:55.593722 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.593686 2575 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podStartSLOduration=3.593672831 podStartE2EDuration="3.593672831s" podCreationTimestamp="2026-03-18 17:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:39:55.592020157 +0000 UTC m=+3313.903273100" watchObservedRunningTime="2026-03-18 17:39:55.593672831 +0000 UTC m=+3313.904925753" Mar 18 17:39:55.604682 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.604662 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2"] Mar 18 17:39:55.610400 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:55.610380 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-86bc7486f5-p87l2"] Mar 18 17:39:56.286758 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:39:56.286730 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:39:56.289923 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:56.289901 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" path="/var/lib/kubelet/pods/ad63772a-9b66-4b62-9b5d-6468583e0479/volumes" Mar 18 17:39:56.579334 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:39:56.579259 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.72:8080: connect: connection refused" Mar 18 17:40:06.580093 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:40:06.580053 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.72:8080: connect: connection refused" Mar 18 17:40:10.286727 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:40:10.286694 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" 
podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:40:16.580145 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:40:16.580104 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.72:8080: connect: connection refused" Mar 18 17:40:24.286645 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:40:24.286610 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:40:26.580041 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:40:26.579997 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.72:8080: connect: connection refused" Mar 18 17:40:36.579271 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:40:36.579225 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.72:8080: connect: connection refused" Mar 18 17:40:39.286856 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:40:39.286825 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:40:46.580144 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:40:46.580095 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.72:8080: connect: connection refused" Mar 18 17:40:52.288573 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:40:52.288503 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI 
artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:40:56.579348 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:40:56.579306 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.72:8080: connect: connection refused" Mar 18 17:41:03.286839 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:41:03.286808 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:41:06.579626 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:06.579585 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.72:8080: connect: connection refused" Mar 18 17:41:11.286524 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:11.286497 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:41:12.442953 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.442918 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw"] Mar 18 17:41:12.649881 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.649851 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm"] Mar 18 17:41:12.650174 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.650163 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" Mar 18 17:41:12.650219 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.650177 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" Mar 18 17:41:12.650219 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.650189 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="storage-initializer" Mar 18 17:41:12.650219 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.650195 2575 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="storage-initializer" Mar 18 17:41:12.650318 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.650246 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad63772a-9b66-4b62-9b5d-6468583e0479" containerName="kserve-container" Mar 18 17:41:12.653187 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.653174 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:12.655240 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.655223 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Mar 18 17:41:12.665606 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.665584 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm"] Mar 18 17:41:12.760436 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.760380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84fbebf0-df31-4f97-8abc-7f1d814fd214-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:12.760436 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.760415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrmc\" (UniqueName: \"kubernetes.io/projected/84fbebf0-df31-4f97-8abc-7f1d814fd214-kube-api-access-lbrmc\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:12.760593 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.760438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/84fbebf0-df31-4f97-8abc-7f1d814fd214-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:12.822231 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.822167 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" containerID="cri-o://eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a" gracePeriod=30 Mar 18 17:41:12.861747 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.861725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84fbebf0-df31-4f97-8abc-7f1d814fd214-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:12.861863 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.861761 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lbrmc\" (UniqueName: \"kubernetes.io/projected/84fbebf0-df31-4f97-8abc-7f1d814fd214-kube-api-access-lbrmc\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:12.861863 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.861787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/84fbebf0-df31-4f97-8abc-7f1d814fd214-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:12.862129 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.862109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84fbebf0-df31-4f97-8abc-7f1d814fd214-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:12.862346 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.862330 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/84fbebf0-df31-4f97-8abc-7f1d814fd214-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:12.869146 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.869122 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrmc\" (UniqueName: \"kubernetes.io/projected/84fbebf0-df31-4f97-8abc-7f1d814fd214-kube-api-access-lbrmc\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:12.963014 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:12.962994 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:13.088563 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:13.088536 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm"] Mar 18 17:41:13.089941 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:41:13.089905 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fbebf0_df31_4f97_8abc_7f1d814fd214.slice/crio-d5e9a1d3cb66a1d8110a45b3a6ca9bc2695b6a586d327f25b1361c3a5fddfde9 WatchSource:0}: Error finding container d5e9a1d3cb66a1d8110a45b3a6ca9bc2695b6a586d327f25b1361c3a5fddfde9: Status 404 returned error can't find the container with id d5e9a1d3cb66a1d8110a45b3a6ca9bc2695b6a586d327f25b1361c3a5fddfde9 Mar 18 17:41:13.091877 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:13.091860 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:41:13.826830 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:13.826797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" event={"ID":"84fbebf0-df31-4f97-8abc-7f1d814fd214","Type":"ContainerStarted","Data":"8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be"} Mar 18 17:41:13.827205 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:13.826837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" event={"ID":"84fbebf0-df31-4f97-8abc-7f1d814fd214","Type":"ContainerStarted","Data":"d5e9a1d3cb66a1d8110a45b3a6ca9bc2695b6a586d327f25b1361c3a5fddfde9"} Mar 18 17:41:14.831421 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:14.831380 2575 generic.go:358] "Generic (PLEG): container finished" podID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerID="8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be" exitCode=0 Mar 18 17:41:14.831805 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:14.831455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" event={"ID":"84fbebf0-df31-4f97-8abc-7f1d814fd214","Type":"ContainerDied","Data":"8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be"} Mar 18 17:41:15.286476 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:41:15.286448 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:41:15.836202 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:15.836169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" 
event={"ID":"84fbebf0-df31-4f97-8abc-7f1d814fd214","Type":"ContainerStarted","Data":"516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40"} Mar 18 17:41:15.836615 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:15.836396 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:41:15.837766 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:15.837735 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Mar 18 17:41:15.855269 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:15.855222 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podStartSLOduration=3.855208742 podStartE2EDuration="3.855208742s" podCreationTimestamp="2026-03-18 17:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:41:15.853522256 +0000 UTC m=+3394.164775168" watchObservedRunningTime="2026-03-18 17:41:15.855208742 +0000 UTC m=+3394.166461662" Mar 18 17:41:16.840241 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:16.840197 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Mar 18 17:41:17.159389 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.159347 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:41:17.299002 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.298971 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcq7z\" (UniqueName: \"kubernetes.io/projected/e1cdc0d1-3960-4554-864d-a5713c259cf4-kube-api-access-qcq7z\") pod \"e1cdc0d1-3960-4554-864d-a5713c259cf4\" (UID: \"e1cdc0d1-3960-4554-864d-a5713c259cf4\") " Mar 18 17:41:17.299154 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.299031 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1cdc0d1-3960-4554-864d-a5713c259cf4-kserve-provision-location\") pod \"e1cdc0d1-3960-4554-864d-a5713c259cf4\" (UID: \"e1cdc0d1-3960-4554-864d-a5713c259cf4\") " Mar 18 17:41:17.299350 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.299331 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1cdc0d1-3960-4554-864d-a5713c259cf4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e1cdc0d1-3960-4554-864d-a5713c259cf4" (UID: "e1cdc0d1-3960-4554-864d-a5713c259cf4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:41:17.301076 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.301045 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1cdc0d1-3960-4554-864d-a5713c259cf4-kube-api-access-qcq7z" (OuterVolumeSpecName: "kube-api-access-qcq7z") pod "e1cdc0d1-3960-4554-864d-a5713c259cf4" (UID: "e1cdc0d1-3960-4554-864d-a5713c259cf4"). InnerVolumeSpecName "kube-api-access-qcq7z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:41:17.399751 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.399677 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qcq7z\" (UniqueName: \"kubernetes.io/projected/e1cdc0d1-3960-4554-864d-a5713c259cf4-kube-api-access-qcq7z\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:41:17.399751 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.399713 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1cdc0d1-3960-4554-864d-a5713c259cf4-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:41:17.843888 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.843800 2575 generic.go:358] "Generic (PLEG): container finished" podID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerID="eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a" exitCode=0 Mar 18 17:41:17.843888 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.843854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" event={"ID":"e1cdc0d1-3960-4554-864d-a5713c259cf4","Type":"ContainerDied","Data":"eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a"} Mar 18 17:41:17.843888 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.843883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" event={"ID":"e1cdc0d1-3960-4554-864d-a5713c259cf4","Type":"ContainerDied","Data":"3bab399c66084221bb2a2ec17f1a092cfdee0e109a78b96c915fa3218a201d0c"} Mar 18 17:41:17.844464 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.843887 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw" Mar 18 17:41:17.844464 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.843897 2575 scope.go:117] "RemoveContainer" containerID="eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a" Mar 18 17:41:17.852544 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.852527 2575 scope.go:117] "RemoveContainer" containerID="6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb" Mar 18 17:41:17.865639 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.865613 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw"] Mar 18 17:41:17.869293 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.869274 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7cdc46bbf5-844gw"] Mar 18 17:41:17.878102 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.878081 2575 scope.go:117] "RemoveContainer" containerID="eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a" Mar 18 17:41:17.878400 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:41:17.878377 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a\": container with ID starting with eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a not found: ID does not exist" containerID="eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a" Mar 18 17:41:17.878498 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.878415 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a"} err="failed to get container status \"eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a\": rpc error: code = NotFound desc = could not find container \"eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a\": container with ID starting with eef39dcad7aefee74ab07adcf2fc008647d8d8f336cc7ec143dc8ec268c4128a not found: ID does not exist" Mar 18 17:41:17.878498 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.878438 2575 scope.go:117] "RemoveContainer" containerID="6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb" Mar 18 17:41:17.878736 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:41:17.878705 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb\": container with ID starting with 6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb not found: ID does not exist" containerID="6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb" Mar 18 17:41:17.878841 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:17.878744 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb"} err="failed to get container status \"6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb\": rpc error: code = NotFound desc = could not find container \"6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb\": container with ID starting with 6d9855a436e5adf4ffe223ad61df8234fe58a33ca41f1a27ac4a3494989be2fb not found: ID does not exist" Mar 18 17:41:18.290696 ip-10-0-139-49 kubenswrapper[2575]: I0318 
17:41:18.290667 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" path="/var/lib/kubelet/pods/e1cdc0d1-3960-4554-864d-a5713c259cf4/volumes" Mar 18 17:41:26.841087 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:26.841046 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Mar 18 17:41:29.286649 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:41:29.286618 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:41:36.840581 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:36.840541 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Mar 18 17:41:40.287731 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:41:40.287700 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:41:46.841069 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:46.841024 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Mar 18 17:41:55.286614 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:41:55.286582 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:41:56.841046 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:41:56.841011 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Mar 18 17:42:06.286695 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:42:06.286652 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:42:06.841206 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:06.841166 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Mar 18 17:42:16.840650 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:16.840610 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Mar 18 17:42:18.287538 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:42:18.287508 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:42:22.287786 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:22.287746 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.73:8080: connect: connection refused" Mar 18 17:42:32.288770 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:42:32.288738 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:42:32.289930 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:32.289912 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:42:32.665630 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:32.665600 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm"] Mar 18 17:42:33.097158 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.097085 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" containerID="cri-o://516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40" gracePeriod=30 Mar 18 17:42:33.665736 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.665700 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd"] Mar 18 17:42:33.666108 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.666025 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="storage-initializer" Mar 18 17:42:33.666108 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.666036 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="storage-initializer" Mar 18 17:42:33.666108 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.666060 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" Mar 18 17:42:33.666108 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.666065 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" Mar 18 17:42:33.666253 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.666112 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1cdc0d1-3960-4554-864d-a5713c259cf4" containerName="kserve-container" Mar 18 17:42:33.669188 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.669169 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" Mar 18 17:42:33.678249 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.678230 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd"] Mar 18 17:42:33.754079 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.754048 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqb2m\" (UniqueName: \"kubernetes.io/projected/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kube-api-access-mqb2m\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd\" (UID: \"44bd78f6-b29b-40cb-985f-8392c36fe1c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" Mar 18 17:42:33.754207 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.754139 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd\" (UID: \"44bd78f6-b29b-40cb-985f-8392c36fe1c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" Mar 18 17:42:33.854700 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.854673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd\" (UID: \"44bd78f6-b29b-40cb-985f-8392c36fe1c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" Mar 18 17:42:33.854810 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.854720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqb2m\" (UniqueName: \"kubernetes.io/projected/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kube-api-access-mqb2m\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd\" (UID: \"44bd78f6-b29b-40cb-985f-8392c36fe1c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" Mar 18 17:42:33.855035 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.855016 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd\" (UID: \"44bd78f6-b29b-40cb-985f-8392c36fe1c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" Mar 18 17:42:33.862733 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.862709 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqb2m\" (UniqueName: \"kubernetes.io/projected/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kube-api-access-mqb2m\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd\" (UID: \"44bd78f6-b29b-40cb-985f-8392c36fe1c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" Mar 18 17:42:33.981104 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:33.981038 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" Mar 18 17:42:34.104622 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:34.104588 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd"] Mar 18 17:42:34.107269 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:42:34.107242 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44bd78f6_b29b_40cb_985f_8392c36fe1c9.slice/crio-103df6bfe0a2b0a268b938918631a2e03f6a40d00e7837c05ce006960541d383 WatchSource:0}: Error finding container 103df6bfe0a2b0a268b938918631a2e03f6a40d00e7837c05ce006960541d383: Status 404 returned error can't find the container with id 103df6bfe0a2b0a268b938918631a2e03f6a40d00e7837c05ce006960541d383 Mar 18 17:42:35.104713 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:35.104676 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" event={"ID":"44bd78f6-b29b-40cb-985f-8392c36fe1c9","Type":"ContainerStarted","Data":"ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c"} Mar 18 17:42:35.104713 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:35.104713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" event={"ID":"44bd78f6-b29b-40cb-985f-8392c36fe1c9","Type":"ContainerStarted","Data":"103df6bfe0a2b0a268b938918631a2e03f6a40d00e7837c05ce006960541d383"} Mar 18 17:42:37.337112 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:37.337090 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:42:37.481672 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:37.481598 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbrmc\" (UniqueName: \"kubernetes.io/projected/84fbebf0-df31-4f97-8abc-7f1d814fd214-kube-api-access-lbrmc\") pod \"84fbebf0-df31-4f97-8abc-7f1d814fd214\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " Mar 18 17:42:37.481672 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:37.481635 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84fbebf0-df31-4f97-8abc-7f1d814fd214-kserve-provision-location\") pod \"84fbebf0-df31-4f97-8abc-7f1d814fd214\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " Mar 18 17:42:37.481852 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:37.481685 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/84fbebf0-df31-4f97-8abc-7f1d814fd214-cabundle-cert\") pod \"84fbebf0-df31-4f97-8abc-7f1d814fd214\" (UID: \"84fbebf0-df31-4f97-8abc-7f1d814fd214\") " Mar 18 17:42:37.481983 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:37.481953 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84fbebf0-df31-4f97-8abc-7f1d814fd214-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84fbebf0-df31-4f97-8abc-7f1d814fd214" (UID: "84fbebf0-df31-4f97-8abc-7f1d814fd214"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:42:37.482090 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:37.482041 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84fbebf0-df31-4f97-8abc-7f1d814fd214-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "84fbebf0-df31-4f97-8abc-7f1d814fd214" (UID: "84fbebf0-df31-4f97-8abc-7f1d814fd214"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 17:42:37.483750 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:37.483727 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84fbebf0-df31-4f97-8abc-7f1d814fd214-kube-api-access-lbrmc" (OuterVolumeSpecName: "kube-api-access-lbrmc") pod "84fbebf0-df31-4f97-8abc-7f1d814fd214" (UID: "84fbebf0-df31-4f97-8abc-7f1d814fd214"). InnerVolumeSpecName "kube-api-access-lbrmc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:42:37.582647 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:37.582614 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/84fbebf0-df31-4f97-8abc-7f1d814fd214-cabundle-cert\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:42:37.582647 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:37.582646 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbrmc\" (UniqueName: \"kubernetes.io/projected/84fbebf0-df31-4f97-8abc-7f1d814fd214-kube-api-access-lbrmc\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:42:37.582647 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:37.582655 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84fbebf0-df31-4f97-8abc-7f1d814fd214-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:42:38.115045 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.115011 2575 generic.go:358] "Generic (PLEG): container finished" podID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerID="516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40" exitCode=0 Mar 18 17:42:38.115203 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.115089 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" Mar 18 17:42:38.115203 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.115094 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" event={"ID":"84fbebf0-df31-4f97-8abc-7f1d814fd214","Type":"ContainerDied","Data":"516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40"} Mar 18 17:42:38.115203 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.115131 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm" event={"ID":"84fbebf0-df31-4f97-8abc-7f1d814fd214","Type":"ContainerDied","Data":"d5e9a1d3cb66a1d8110a45b3a6ca9bc2695b6a586d327f25b1361c3a5fddfde9"} Mar 18 17:42:38.115203 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.115147 2575 scope.go:117] "RemoveContainer" containerID="516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40" Mar 18 17:42:38.123969 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.123953 2575 scope.go:117] "RemoveContainer" containerID="8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be" Mar 18 17:42:38.135883 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.135858 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm"] Mar 18 17:42:38.137296 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.137278 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-7c8d8c5bc9-jclpm"] Mar 18 17:42:38.148100 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.148085 2575 scope.go:117] "RemoveContainer" containerID="516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40" Mar 18 17:42:38.148348 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:42:38.148330 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40\": container with ID starting with 516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40 not found: ID does not exist" containerID="516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40" Mar 18 17:42:38.148429 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.148372 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40"} err="failed to get container status \"516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40\": rpc error: code = NotFound desc = could not find container \"516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40\": container with ID starting with 516522b4cfd24c7fa54ed1e8043b2cf26a967cc78e5d14af01838249c4f85f40 not found: ID does not exist" Mar 18 17:42:38.148429 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.148390 2575 scope.go:117] "RemoveContainer" containerID="8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be" Mar 18 17:42:38.148648 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:42:38.148631 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be\": container with ID starting with 8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be not 
found: ID does not exist" containerID="8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be" Mar 18 17:42:38.148697 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.148651 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be"} err="failed to get container status \"8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be\": rpc error: code = NotFound desc = could not find container \"8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be\": container with ID starting with 8a7e07ad0c8feba21c097aab61b0a6cf8c879961076b787e2257a875f59b20be not found: ID does not exist" Mar 18 17:42:38.290309 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:38.290284 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" path="/var/lib/kubelet/pods/84fbebf0-df31-4f97-8abc-7f1d814fd214/volumes" Mar 18 17:42:39.119294 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:39.119275 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd_44bd78f6-b29b-40cb-985f-8392c36fe1c9/storage-initializer/0.log" Mar 18 17:42:39.119609 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:39.119311 2575 generic.go:358] "Generic (PLEG): container finished" podID="44bd78f6-b29b-40cb-985f-8392c36fe1c9" containerID="ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c" exitCode=1 Mar 18 17:42:39.119609 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:39.119396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" event={"ID":"44bd78f6-b29b-40cb-985f-8392c36fe1c9","Type":"ContainerDied","Data":"ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c"} Mar 18 17:42:40.127528 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:40.127499 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd_44bd78f6-b29b-40cb-985f-8392c36fe1c9/storage-initializer/0.log" Mar 18 17:42:40.127916 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:40.127584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" event={"ID":"44bd78f6-b29b-40cb-985f-8392c36fe1c9","Type":"ContainerStarted","Data":"f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb"} Mar 18 17:42:43.775543 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:43.775508 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd"] Mar 18 17:42:43.775928 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:43.775758 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" podUID="44bd78f6-b29b-40cb-985f-8392c36fe1c9" containerName="storage-initializer" containerID="cri-o://f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb" gracePeriod=30 Mar 18 17:42:44.843672 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.843644 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc"] Mar 18 17:42:44.844041 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.843954 2575 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="storage-initializer" Mar 18 17:42:44.844041 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.843964 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="storage-initializer" Mar 18 17:42:44.844041 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.843987 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" Mar 18 17:42:44.844041 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.843993 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" Mar 18 17:42:44.844176 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.844045 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="84fbebf0-df31-4f97-8abc-7f1d814fd214" containerName="kserve-container" Mar 18 17:42:44.846913 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.846895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:44.848663 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.848643 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Mar 18 17:42:44.854415 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.854390 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc"] Mar 18 17:42:44.937243 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.937222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dded66b1-cfad-4535-816c-935027fc98d5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:44.937346 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.937254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dded66b1-cfad-4535-816c-935027fc98d5-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:44.937346 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:44.937332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62mjg\" (UniqueName: \"kubernetes.io/projected/dded66b1-cfad-4535-816c-935027fc98d5-kube-api-access-62mjg\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:45.038718 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.038689 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62mjg\" (UniqueName: \"kubernetes.io/projected/dded66b1-cfad-4535-816c-935027fc98d5-kube-api-access-62mjg\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:45.038805 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.038774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dded66b1-cfad-4535-816c-935027fc98d5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:45.038865 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.038818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dded66b1-cfad-4535-816c-935027fc98d5-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:45.039096 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.039079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dded66b1-cfad-4535-816c-935027fc98d5-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:45.039328 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.039312 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dded66b1-cfad-4535-816c-935027fc98d5-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:45.046403 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.046380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62mjg\" (UniqueName: \"kubernetes.io/projected/dded66b1-cfad-4535-816c-935027fc98d5-kube-api-access-62mjg\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:45.158792 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.158771 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:45.280403 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.280377 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc"] Mar 18 17:42:45.282130 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:42:45.282102 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddded66b1_cfad_4535_816c_935027fc98d5.slice/crio-3e277cd1a9b384e80deb809cf94d3d6e32f8b4893647bdd60394609634a57114 WatchSource:0}: Error finding container 3e277cd1a9b384e80deb809cf94d3d6e32f8b4893647bdd60394609634a57114: Status 404 returned error can't find the container with id 3e277cd1a9b384e80deb809cf94d3d6e32f8b4893647bdd60394609634a57114 Mar 18 17:42:45.614448 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.614428 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd_44bd78f6-b29b-40cb-985f-8392c36fe1c9/storage-initializer/1.log" Mar 18 17:42:45.614853 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.614840 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd_44bd78f6-b29b-40cb-985f-8392c36fe1c9/storage-initializer/0.log" Mar 18 17:42:45.614928 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.614898 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" Mar 18 17:42:45.746071 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.746001 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kserve-provision-location\") pod \"44bd78f6-b29b-40cb-985f-8392c36fe1c9\" (UID: \"44bd78f6-b29b-40cb-985f-8392c36fe1c9\") " Mar 18 17:42:45.746071 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.746053 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqb2m\" (UniqueName: \"kubernetes.io/projected/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kube-api-access-mqb2m\") pod \"44bd78f6-b29b-40cb-985f-8392c36fe1c9\" (UID: \"44bd78f6-b29b-40cb-985f-8392c36fe1c9\") " Mar 18 17:42:45.746317 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.746294 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "44bd78f6-b29b-40cb-985f-8392c36fe1c9" (UID: "44bd78f6-b29b-40cb-985f-8392c36fe1c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:42:45.748203 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.748182 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kube-api-access-mqb2m" (OuterVolumeSpecName: "kube-api-access-mqb2m") pod "44bd78f6-b29b-40cb-985f-8392c36fe1c9" (UID: "44bd78f6-b29b-40cb-985f-8392c36fe1c9"). InnerVolumeSpecName "kube-api-access-mqb2m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:42:45.847200 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.847173 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:42:45.847200 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:45.847195 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqb2m\" (UniqueName: \"kubernetes.io/projected/44bd78f6-b29b-40cb-985f-8392c36fe1c9-kube-api-access-mqb2m\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:42:46.145487 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.145460 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd_44bd78f6-b29b-40cb-985f-8392c36fe1c9/storage-initializer/1.log" Mar 18 17:42:46.145804 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.145787 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd_44bd78f6-b29b-40cb-985f-8392c36fe1c9/storage-initializer/0.log" Mar 18 17:42:46.145914 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.145829 2575 generic.go:358] "Generic (PLEG): container finished" podID="44bd78f6-b29b-40cb-985f-8392c36fe1c9" containerID="f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb" exitCode=1 Mar 18 17:42:46.145914 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.145900 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" Mar 18 17:42:46.146027 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.145909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" event={"ID":"44bd78f6-b29b-40cb-985f-8392c36fe1c9","Type":"ContainerDied","Data":"f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb"} Mar 18 17:42:46.146027 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.145947 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd" event={"ID":"44bd78f6-b29b-40cb-985f-8392c36fe1c9","Type":"ContainerDied","Data":"103df6bfe0a2b0a268b938918631a2e03f6a40d00e7837c05ce006960541d383"} Mar 18 17:42:46.146027 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.145969 2575 scope.go:117] "RemoveContainer" containerID="f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb" Mar 18 17:42:46.147378 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.147333 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" event={"ID":"dded66b1-cfad-4535-816c-935027fc98d5","Type":"ContainerStarted","Data":"185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5"} Mar 18 17:42:46.147481 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.147386 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" event={"ID":"dded66b1-cfad-4535-816c-935027fc98d5","Type":"ContainerStarted","Data":"3e277cd1a9b384e80deb809cf94d3d6e32f8b4893647bdd60394609634a57114"} Mar 18 17:42:46.155508 ip-10-0-139-49 kubenswrapper[2575]: I0318 
17:42:46.155484 2575 scope.go:117] "RemoveContainer" containerID="ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c" Mar 18 17:42:46.176455 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.176436 2575 scope.go:117] "RemoveContainer" containerID="f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb" Mar 18 17:42:46.176730 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:42:46.176710 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb\": container with ID starting with f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb not found: ID does not exist" containerID="f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb" Mar 18 17:42:46.176792 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.176738 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb"} err="failed to get container status \"f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb\": rpc error: code = NotFound desc = could not find container \"f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb\": container with ID starting with f9118c69a6a8de55e7783bbdb6598169706bca69fda81a78af9d9d224c598bfb not found: ID does not exist" Mar 18 17:42:46.176792 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.176758 2575 scope.go:117] "RemoveContainer" containerID="ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c" Mar 18 17:42:46.177007 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:42:46.176988 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c\": container with ID starting with ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c not found: ID does not exist" containerID="ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c" Mar 18 17:42:46.177052 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.177014 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c"} err="failed to get container status \"ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c\": rpc error: code = NotFound desc = could not find container \"ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c\": container with ID starting with ecb8d2156bd597e18f9bd061d4cf4f7da9b1e212f516919a039dfa3c5fb7ee9c not found: ID does not exist" Mar 18 17:42:46.198019 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.197993 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd"] Mar 18 17:42:46.199829 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.199808 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-777b7cd77-h52xd"] Mar 18 17:42:46.289639 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:46.289612 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44bd78f6-b29b-40cb-985f-8392c36fe1c9" path="/var/lib/kubelet/pods/44bd78f6-b29b-40cb-985f-8392c36fe1c9/volumes" Mar 18 17:42:47.152080 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:47.152047 2575 generic.go:358] "Generic (PLEG): container 
finished" podID="dded66b1-cfad-4535-816c-935027fc98d5" containerID="185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5" exitCode=0 Mar 18 17:42:47.152438 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:47.152136 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" event={"ID":"dded66b1-cfad-4535-816c-935027fc98d5","Type":"ContainerDied","Data":"185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5"} Mar 18 17:42:47.287571 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:42:47.287547 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:42:48.156773 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:48.156739 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" event={"ID":"dded66b1-cfad-4535-816c-935027fc98d5","Type":"ContainerStarted","Data":"30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70"} Mar 18 17:42:48.157197 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:48.156972 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:42:48.158339 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:48.158311 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Mar 18 17:42:48.177139 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:48.177095 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podStartSLOduration=4.177081658 podStartE2EDuration="4.177081658s" podCreationTimestamp="2026-03-18 17:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:42:48.176402996 +0000 UTC m=+3486.487655919" watchObservedRunningTime="2026-03-18 17:42:48.177081658 +0000 UTC m=+3486.488334604" Mar 18 17:42:49.160133 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:49.160097 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Mar 18 17:42:59.160828 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:42:59.160787 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Mar 18 17:43:02.289103 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:43:02.289072 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:43:09.160346 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:43:09.160306 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Mar 18 17:43:15.286575 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:43:15.286541 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:43:19.160773 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:43:19.160727 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Mar 18 17:43:29.160888 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:43:29.160848 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Mar 18 17:43:29.286884 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:43:29.286848 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest 
fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:43:39.160501 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:43:39.160459 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Mar 18 17:43:40.563387 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:43:40.563285 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 17:43:40.563794 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:43:40.563472 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhs26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-ttb7x_kserve(e41abd12-239d-46fb-9cdb-9d35fa51024d): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 17:43:40.564645 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:43:40.564618 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:43:49.160765 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:43:49.160712 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Mar 18 17:43:51.286761 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:43:51.286727 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:43:58.286502 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:43:58.286461 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Mar 18 17:44:05.287119 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:44:05.287090 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:44:08.289600 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:08.289569 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:44:14.757573 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:14.757537 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc"] Mar 18 17:44:14.758022 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:14.757795 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" containerID="cri-o://30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70" gracePeriod=30 Mar 18 17:44:16.146401 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.146347 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg"] Mar 18 17:44:16.146773 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.146718 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44bd78f6-b29b-40cb-985f-8392c36fe1c9" containerName="storage-initializer" Mar 18 17:44:16.146773 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.146731 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bd78f6-b29b-40cb-985f-8392c36fe1c9" containerName="storage-initializer" Mar 18 17:44:16.146773 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.146742 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44bd78f6-b29b-40cb-985f-8392c36fe1c9" containerName="storage-initializer" Mar 18 17:44:16.146773 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.146747 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bd78f6-b29b-40cb-985f-8392c36fe1c9" containerName="storage-initializer" Mar 18 17:44:16.146902 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.146809 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="44bd78f6-b29b-40cb-985f-8392c36fe1c9" containerName="storage-initializer" Mar 18 17:44:16.146902 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.146822 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="44bd78f6-b29b-40cb-985f-8392c36fe1c9" containerName="storage-initializer" 
Mar 18 17:44:16.149780 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.149764 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" Mar 18 17:44:16.158515 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.158494 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg"] Mar 18 17:44:16.265572 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.265543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc5pz\" (UniqueName: \"kubernetes.io/projected/902e409a-cbd4-4ff5-afea-c6b2a066e081-kube-api-access-cc5pz\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg\" (UID: \"902e409a-cbd4-4ff5-afea-c6b2a066e081\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" Mar 18 17:44:16.265741 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.265612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/902e409a-cbd4-4ff5-afea-c6b2a066e081-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg\" (UID: \"902e409a-cbd4-4ff5-afea-c6b2a066e081\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" Mar 18 17:44:16.366608 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.366576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/902e409a-cbd4-4ff5-afea-c6b2a066e081-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg\" (UID: \"902e409a-cbd4-4ff5-afea-c6b2a066e081\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" Mar 18 17:44:16.366765 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.366630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc5pz\" (UniqueName: \"kubernetes.io/projected/902e409a-cbd4-4ff5-afea-c6b2a066e081-kube-api-access-cc5pz\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg\" (UID: \"902e409a-cbd4-4ff5-afea-c6b2a066e081\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" Mar 18 17:44:16.366950 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.366930 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/902e409a-cbd4-4ff5-afea-c6b2a066e081-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg\" (UID: \"902e409a-cbd4-4ff5-afea-c6b2a066e081\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" Mar 18 17:44:16.375762 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.375731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc5pz\" (UniqueName: \"kubernetes.io/projected/902e409a-cbd4-4ff5-afea-c6b2a066e081-kube-api-access-cc5pz\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg\" (UID: \"902e409a-cbd4-4ff5-afea-c6b2a066e081\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" Mar 18 17:44:16.460468 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.460398 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" Mar 18 17:44:16.587527 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:16.587438 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg"] Mar 18 17:44:16.589925 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:44:16.589896 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod902e409a_cbd4_4ff5_afea_c6b2a066e081.slice/crio-c86fee9f74e55770565006c5439a81d4b24b7bfd0991f492ce532bde3fab14db WatchSource:0}: Error finding container c86fee9f74e55770565006c5439a81d4b24b7bfd0991f492ce532bde3fab14db: Status 404 returned error can't find the container with id c86fee9f74e55770565006c5439a81d4b24b7bfd0991f492ce532bde3fab14db Mar 18 17:44:17.287215 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:44:17.287184 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:44:17.444020 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:17.443977 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" event={"ID":"902e409a-cbd4-4ff5-afea-c6b2a066e081","Type":"ContainerStarted","Data":"c816c99a8afaf0886f2d727e61481527455221429e17dc21a3d56d41356104ea"} Mar 18 17:44:17.444020 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:17.444011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" event={"ID":"902e409a-cbd4-4ff5-afea-c6b2a066e081","Type":"ContainerStarted","Data":"c86fee9f74e55770565006c5439a81d4b24b7bfd0991f492ce532bde3fab14db"} Mar 18 17:44:18.287198 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:18.287154 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.75:8080: connect: connection refused" Mar 18 17:44:19.601035 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:19.601013 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:44:19.694907 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:19.694877 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62mjg\" (UniqueName: \"kubernetes.io/projected/dded66b1-cfad-4535-816c-935027fc98d5-kube-api-access-62mjg\") pod \"dded66b1-cfad-4535-816c-935027fc98d5\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " Mar 18 17:44:19.695060 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:19.694955 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dded66b1-cfad-4535-816c-935027fc98d5-kserve-provision-location\") pod \"dded66b1-cfad-4535-816c-935027fc98d5\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " Mar 18 17:44:19.695060 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:19.694985 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dded66b1-cfad-4535-816c-935027fc98d5-cabundle-cert\") pod \"dded66b1-cfad-4535-816c-935027fc98d5\" (UID: \"dded66b1-cfad-4535-816c-935027fc98d5\") " Mar 18 17:44:19.695282 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:19.695260 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dded66b1-cfad-4535-816c-935027fc98d5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dded66b1-cfad-4535-816c-935027fc98d5" (UID: "dded66b1-cfad-4535-816c-935027fc98d5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:44:19.695323 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:19.695288 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dded66b1-cfad-4535-816c-935027fc98d5-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "dded66b1-cfad-4535-816c-935027fc98d5" (UID: "dded66b1-cfad-4535-816c-935027fc98d5"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 17:44:19.697145 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:19.697127 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dded66b1-cfad-4535-816c-935027fc98d5-kube-api-access-62mjg" (OuterVolumeSpecName: "kube-api-access-62mjg") pod "dded66b1-cfad-4535-816c-935027fc98d5" (UID: "dded66b1-cfad-4535-816c-935027fc98d5"). InnerVolumeSpecName "kube-api-access-62mjg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:44:19.796110 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:19.796081 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dded66b1-cfad-4535-816c-935027fc98d5-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:44:19.796110 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:19.796106 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dded66b1-cfad-4535-816c-935027fc98d5-cabundle-cert\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:44:19.796110 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:19.796117 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-62mjg\" (UniqueName: \"kubernetes.io/projected/dded66b1-cfad-4535-816c-935027fc98d5-kube-api-access-62mjg\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:44:20.453535 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.453495 2575 generic.go:358] "Generic (PLEG): container finished" podID="dded66b1-cfad-4535-816c-935027fc98d5" containerID="30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70" exitCode=0 Mar 18 17:44:20.453772 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.453587 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" Mar 18 17:44:20.453772 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.453586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" event={"ID":"dded66b1-cfad-4535-816c-935027fc98d5","Type":"ContainerDied","Data":"30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70"} Mar 18 17:44:20.453772 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.453628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc" event={"ID":"dded66b1-cfad-4535-816c-935027fc98d5","Type":"ContainerDied","Data":"3e277cd1a9b384e80deb809cf94d3d6e32f8b4893647bdd60394609634a57114"} Mar 18 17:44:20.453772 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.453651 2575 scope.go:117] "RemoveContainer" containerID="30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70" Mar 18 17:44:20.461910 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.461893 2575 scope.go:117] "RemoveContainer" containerID="185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5" Mar 18 17:44:20.473305 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.473284 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc"] Mar 18 17:44:20.474803 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.474701 2575 scope.go:117] "RemoveContainer" containerID="30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70" Mar 18 17:44:20.474981 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:44:20.474962 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70\": container with ID starting with 30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70 not found: ID does not exist" 
containerID="30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70" Mar 18 17:44:20.475022 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.474990 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70"} err="failed to get container status \"30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70\": rpc error: code = NotFound desc = could not find container \"30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70\": container with ID starting with 30a3580673a49c2521788bd4aa663e29d3c15fe1702a4a77516c558a111fca70 not found: ID does not exist" Mar 18 17:44:20.475022 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.475005 2575 scope.go:117] "RemoveContainer" containerID="185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5" Mar 18 17:44:20.475255 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:44:20.475234 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5\": container with ID starting with 185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5 not found: ID does not exist" containerID="185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5" Mar 18 17:44:20.475307 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.475263 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5"} err="failed to get container status \"185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5\": rpc error: code = NotFound desc = could not find container \"185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5\": container with ID starting with 185de8591fafda7afa73ffb7750d641862f4b1f0ab06bb46348b14e1ff754ac5 not found: ID does not exist" Mar 18 17:44:20.478991 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:20.478972 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-84bf76db47-kpvtc"] Mar 18 17:44:22.289920 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:22.289889 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dded66b1-cfad-4535-816c-935027fc98d5" path="/var/lib/kubelet/pods/dded66b1-cfad-4535-816c-935027fc98d5/volumes" Mar 18 17:44:22.461725 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:22.461692 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_902e409a-cbd4-4ff5-afea-c6b2a066e081/storage-initializer/0.log" Mar 18 17:44:22.461910 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:22.461729 2575 generic.go:358] "Generic (PLEG): container finished" podID="902e409a-cbd4-4ff5-afea-c6b2a066e081" containerID="c816c99a8afaf0886f2d727e61481527455221429e17dc21a3d56d41356104ea" exitCode=1 Mar 18 17:44:22.461910 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:22.461817 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" event={"ID":"902e409a-cbd4-4ff5-afea-c6b2a066e081","Type":"ContainerDied","Data":"c816c99a8afaf0886f2d727e61481527455221429e17dc21a3d56d41356104ea"} Mar 18 17:44:23.467884 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:23.467859 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_902e409a-cbd4-4ff5-afea-c6b2a066e081/storage-initializer/0.log" Mar 18 17:44:23.468270 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:23.467975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" event={"ID":"902e409a-cbd4-4ff5-afea-c6b2a066e081","Type":"ContainerStarted","Data":"088f1f027848e6a7ebfa02c465ddd55b29d06cb15823c281c60063ff6e49ba47"} Mar 18 17:44:25.475871 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:25.475841 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_902e409a-cbd4-4ff5-afea-c6b2a066e081/storage-initializer/1.log" Mar 18 17:44:25.476253 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:25.476166 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_902e409a-cbd4-4ff5-afea-c6b2a066e081/storage-initializer/0.log" Mar 18 17:44:25.476253 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:25.476197 2575 generic.go:358] "Generic (PLEG): container finished" podID="902e409a-cbd4-4ff5-afea-c6b2a066e081" containerID="088f1f027848e6a7ebfa02c465ddd55b29d06cb15823c281c60063ff6e49ba47" exitCode=1 Mar 18 17:44:25.476253 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:25.476247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" event={"ID":"902e409a-cbd4-4ff5-afea-c6b2a066e081","Type":"ContainerDied","Data":"088f1f027848e6a7ebfa02c465ddd55b29d06cb15823c281c60063ff6e49ba47"} Mar 18 17:44:25.476374 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:25.476280 2575 scope.go:117] "RemoveContainer" containerID="c816c99a8afaf0886f2d727e61481527455221429e17dc21a3d56d41356104ea" Mar 18 17:44:25.476639 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:25.476615 2575 scope.go:117] "RemoveContainer" containerID="c816c99a8afaf0886f2d727e61481527455221429e17dc21a3d56d41356104ea" Mar 18 17:44:25.491000 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:44:25.490973 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_kserve-ci-e2e-test_902e409a-cbd4-4ff5-afea-c6b2a066e081_0 in pod sandbox c86fee9f74e55770565006c5439a81d4b24b7bfd0991f492ce532bde3fab14db from index: no such id: 'c816c99a8afaf0886f2d727e61481527455221429e17dc21a3d56d41356104ea'" containerID="c816c99a8afaf0886f2d727e61481527455221429e17dc21a3d56d41356104ea" Mar 18 17:44:25.491066 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:44:25.491023 2575 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_kserve-ci-e2e-test_902e409a-cbd4-4ff5-afea-c6b2a066e081_0 in pod sandbox c86fee9f74e55770565006c5439a81d4b24b7bfd0991f492ce532bde3fab14db from index: no such id: 'c816c99a8afaf0886f2d727e61481527455221429e17dc21a3d56d41356104ea'; Skipping pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_kserve-ci-e2e-test(902e409a-cbd4-4ff5-afea-c6b2a066e081)\"" logger="UnhandledError" Mar 18 17:44:25.492335 ip-10-0-139-49 kubenswrapper[2575]: E0318 
17:44:25.492317 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_kserve-ci-e2e-test(902e409a-cbd4-4ff5-afea-c6b2a066e081)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" podUID="902e409a-cbd4-4ff5-afea-c6b2a066e081" Mar 18 17:44:25.926281 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:25.926252 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg"] Mar 18 17:44:26.480993 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.480963 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_902e409a-cbd4-4ff5-afea-c6b2a066e081/storage-initializer/1.log" Mar 18 17:44:26.612545 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.612527 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_902e409a-cbd4-4ff5-afea-c6b2a066e081/storage-initializer/1.log" Mar 18 17:44:26.612655 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.612587 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" Mar 18 17:44:26.646909 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.646887 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc5pz\" (UniqueName: \"kubernetes.io/projected/902e409a-cbd4-4ff5-afea-c6b2a066e081-kube-api-access-cc5pz\") pod \"902e409a-cbd4-4ff5-afea-c6b2a066e081\" (UID: \"902e409a-cbd4-4ff5-afea-c6b2a066e081\") " Mar 18 17:44:26.647047 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.646967 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/902e409a-cbd4-4ff5-afea-c6b2a066e081-kserve-provision-location\") pod \"902e409a-cbd4-4ff5-afea-c6b2a066e081\" (UID: \"902e409a-cbd4-4ff5-afea-c6b2a066e081\") " Mar 18 17:44:26.647243 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.647217 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/902e409a-cbd4-4ff5-afea-c6b2a066e081-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "902e409a-cbd4-4ff5-afea-c6b2a066e081" (UID: "902e409a-cbd4-4ff5-afea-c6b2a066e081"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:44:26.649072 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.649043 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902e409a-cbd4-4ff5-afea-c6b2a066e081-kube-api-access-cc5pz" (OuterVolumeSpecName: "kube-api-access-cc5pz") pod "902e409a-cbd4-4ff5-afea-c6b2a066e081" (UID: "902e409a-cbd4-4ff5-afea-c6b2a066e081"). InnerVolumeSpecName "kube-api-access-cc5pz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:44:26.748138 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.748085 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cc5pz\" (UniqueName: \"kubernetes.io/projected/902e409a-cbd4-4ff5-afea-c6b2a066e081-kube-api-access-cc5pz\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:44:26.748138 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.748106 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/902e409a-cbd4-4ff5-afea-c6b2a066e081-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:44:26.948990 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.948956 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7"] Mar 18 17:44:26.949282 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949267 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="902e409a-cbd4-4ff5-afea-c6b2a066e081" containerName="storage-initializer" Mar 18 17:44:26.949282 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949284 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="902e409a-cbd4-4ff5-afea-c6b2a066e081" containerName="storage-initializer" Mar 18 17:44:26.949390 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949304 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" Mar 18 17:44:26.949390 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949310 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" Mar 18 17:44:26.949390 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949316 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="storage-initializer" Mar 18 17:44:26.949390 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949322 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="storage-initializer" Mar 18 17:44:26.949531 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949396 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="902e409a-cbd4-4ff5-afea-c6b2a066e081" containerName="storage-initializer" Mar 18 17:44:26.949531 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949405 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dded66b1-cfad-4535-816c-935027fc98d5" containerName="kserve-container" Mar 18 17:44:26.949531 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949459 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="902e409a-cbd4-4ff5-afea-c6b2a066e081" containerName="storage-initializer" Mar 18 17:44:26.949531 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949466 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="902e409a-cbd4-4ff5-afea-c6b2a066e081" containerName="storage-initializer" Mar 18 17:44:26.949531 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.949513 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="902e409a-cbd4-4ff5-afea-c6b2a066e081" containerName="storage-initializer" Mar 18 17:44:26.952589 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.952571 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:26.954706 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.954686 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Mar 18 17:44:26.961329 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:26.961308 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7"] Mar 18 17:44:27.050523 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.050448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c24905c7-188a-4ce8-b261-ae501cb946d6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:27.050523 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.050491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mp4x\" (UniqueName: \"kubernetes.io/projected/c24905c7-188a-4ce8-b261-ae501cb946d6-kube-api-access-5mp4x\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:27.050690 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.050538 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c24905c7-188a-4ce8-b261-ae501cb946d6-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:27.151745 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.151709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c24905c7-188a-4ce8-b261-ae501cb946d6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:27.151872 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.151765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mp4x\" (UniqueName: \"kubernetes.io/projected/c24905c7-188a-4ce8-b261-ae501cb946d6-kube-api-access-5mp4x\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:27.151872 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.151809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c24905c7-188a-4ce8-b261-ae501cb946d6-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 
17:44:27.152085 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.152065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c24905c7-188a-4ce8-b261-ae501cb946d6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:27.152522 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.152503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c24905c7-188a-4ce8-b261-ae501cb946d6-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:27.159584 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.159561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mp4x\" (UniqueName: \"kubernetes.io/projected/c24905c7-188a-4ce8-b261-ae501cb946d6-kube-api-access-5mp4x\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:27.263678 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.263650 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:27.386290 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.386258 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7"] Mar 18 17:44:27.389152 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:44:27.389126 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24905c7_188a_4ce8_b261_ae501cb946d6.slice/crio-ccc7719fdc82be2ce8300d34cdba155592be61a6dde90fc4d4757158b85b8b5a WatchSource:0}: Error finding container ccc7719fdc82be2ce8300d34cdba155592be61a6dde90fc4d4757158b85b8b5a: Status 404 returned error can't find the container with id ccc7719fdc82be2ce8300d34cdba155592be61a6dde90fc4d4757158b85b8b5a Mar 18 17:44:27.487292 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.487261 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" event={"ID":"c24905c7-188a-4ce8-b261-ae501cb946d6","Type":"ContainerStarted","Data":"5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e"} Mar 18 17:44:27.487706 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.487298 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" event={"ID":"c24905c7-188a-4ce8-b261-ae501cb946d6","Type":"ContainerStarted","Data":"ccc7719fdc82be2ce8300d34cdba155592be61a6dde90fc4d4757158b85b8b5a"} Mar 18 17:44:27.488536 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.488512 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg_902e409a-cbd4-4ff5-afea-c6b2a066e081/storage-initializer/1.log" Mar 18 17:44:27.488623 ip-10-0-139-49 kubenswrapper[2575]: 
I0318 17:44:27.488570 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" event={"ID":"902e409a-cbd4-4ff5-afea-c6b2a066e081","Type":"ContainerDied","Data":"c86fee9f74e55770565006c5439a81d4b24b7bfd0991f492ce532bde3fab14db"} Mar 18 17:44:27.488623 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.488594 2575 scope.go:117] "RemoveContainer" containerID="088f1f027848e6a7ebfa02c465ddd55b29d06cb15823c281c60063ff6e49ba47" Mar 18 17:44:27.488731 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.488623 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg" Mar 18 17:44:27.530590 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.530562 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg"] Mar 18 17:44:27.536530 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:27.536507 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7cbf6fcdb9-5dwhg"] Mar 18 17:44:28.290147 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:28.290118 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902e409a-cbd4-4ff5-afea-c6b2a066e081" path="/var/lib/kubelet/pods/902e409a-cbd4-4ff5-afea-c6b2a066e081/volumes" Mar 18 17:44:28.493758 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:28.493670 2575 generic.go:358] "Generic (PLEG): container finished" podID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerID="5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e" exitCode=0 Mar 18 17:44:28.494182 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:28.493758 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" event={"ID":"c24905c7-188a-4ce8-b261-ae501cb946d6","Type":"ContainerDied","Data":"5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e"} Mar 18 17:44:29.498632 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:29.498596 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" event={"ID":"c24905c7-188a-4ce8-b261-ae501cb946d6","Type":"ContainerStarted","Data":"9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb"} Mar 18 17:44:29.498992 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:29.498812 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:44:29.500100 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:29.500076 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused" Mar 18 17:44:29.517220 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:29.517184 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podStartSLOduration=3.517170632 podStartE2EDuration="3.517170632s" podCreationTimestamp="2026-03-18 17:44:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:44:29.516436049 +0000 UTC m=+3587.827688981" watchObservedRunningTime="2026-03-18 17:44:29.517170632 +0000 UTC m=+3587.828423553" Mar 18 17:44:30.503303 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:30.503271 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused" Mar 18 17:44:32.289859 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:44:32.289825 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:44:40.504090 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:40.504050 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused" Mar 18 17:44:42.432987 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:42.432959 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:44:42.441559 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:42.441538 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:44:47.287409 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:44:47.287381 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:44:50.503947 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:44:50.503908 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: 
connection refused" Mar 18 17:45:00.287315 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:45:00.287268 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:45:00.503778 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:00.503740 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused" Mar 18 17:45:10.504085 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:10.504043 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused" Mar 18 17:45:13.286832 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:45:13.286801 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:45:20.504276 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:20.504229 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused" Mar 18 17:45:27.286798 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:45:27.286738 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:45:30.503458 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:30.503420 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused" Mar 18 17:45:40.289505 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:45:40.287615 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:45:40.504201 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:40.504165 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused" Mar 18 17:45:43.286817 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:43.286766 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.77:8080: connect: connection refused" Mar 18 17:45:53.287498 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:53.287414 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:45:55.286554 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:45:55.286525 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:45:57.080992 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:57.080956 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7"] Mar 18 17:45:57.081350 
ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:57.081174 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" containerID="cri-o://9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb" gracePeriod=30 Mar 18 17:45:58.173936 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.173898 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz"] Mar 18 17:45:58.177295 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.177274 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" Mar 18 17:45:58.187714 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.187691 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz"] Mar 18 17:45:58.236454 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.236428 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vtqq\" (UniqueName: \"kubernetes.io/projected/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kube-api-access-6vtqq\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz\" (UID: \"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" Mar 18 17:45:58.236595 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.236460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz\" (UID: \"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" Mar 18 17:45:58.336839 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.336801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vtqq\" (UniqueName: \"kubernetes.io/projected/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kube-api-access-6vtqq\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz\" (UID: \"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" Mar 18 17:45:58.336839 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.336838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz\" (UID: \"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" Mar 18 17:45:58.337212 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.337191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz\" (UID: \"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" Mar 18 17:45:58.345255 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.345233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vtqq\" (UniqueName: \"kubernetes.io/projected/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kube-api-access-6vtqq\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz\" (UID: \"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" Mar 18 17:45:58.487907 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.487844 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" Mar 18 17:45:58.610741 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.610576 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz"] Mar 18 17:45:58.613449 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:45:58.613415 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae3a0f98_7a37_4668_80c1_b1c06a9fa0a8.slice/crio-4c9f737559f91bd3f57b6a10d52f50f60e3d5fb2a0da243fc297a55231403b7c WatchSource:0}: Error finding container 4c9f737559f91bd3f57b6a10d52f50f60e3d5fb2a0da243fc297a55231403b7c: Status 404 returned error can't find the container with id 4c9f737559f91bd3f57b6a10d52f50f60e3d5fb2a0da243fc297a55231403b7c Mar 18 17:45:58.782907 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.782819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" event={"ID":"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8","Type":"ContainerStarted","Data":"8a9c3cb4fdc9356f5e971cadc9ce1edb05bdea7a1a5390d458bfc2d633ff9a30"} Mar 18 17:45:58.782907 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:45:58.782858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" event={"ID":"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8","Type":"ContainerStarted","Data":"4c9f737559f91bd3f57b6a10d52f50f60e3d5fb2a0da243fc297a55231403b7c"} Mar 18 17:46:01.527968 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.527941 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:46:01.559853 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.559827 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c24905c7-188a-4ce8-b261-ae501cb946d6-cabundle-cert\") pod \"c24905c7-188a-4ce8-b261-ae501cb946d6\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " Mar 18 17:46:01.560004 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.559864 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c24905c7-188a-4ce8-b261-ae501cb946d6-kserve-provision-location\") pod \"c24905c7-188a-4ce8-b261-ae501cb946d6\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " Mar 18 17:46:01.560004 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.559888 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mp4x\" (UniqueName: \"kubernetes.io/projected/c24905c7-188a-4ce8-b261-ae501cb946d6-kube-api-access-5mp4x\") pod \"c24905c7-188a-4ce8-b261-ae501cb946d6\" (UID: \"c24905c7-188a-4ce8-b261-ae501cb946d6\") " Mar 18 17:46:01.560156 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.560138 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24905c7-188a-4ce8-b261-ae501cb946d6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c24905c7-188a-4ce8-b261-ae501cb946d6" (UID: "c24905c7-188a-4ce8-b261-ae501cb946d6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:46:01.560255 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.560226 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24905c7-188a-4ce8-b261-ae501cb946d6-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "c24905c7-188a-4ce8-b261-ae501cb946d6" (UID: "c24905c7-188a-4ce8-b261-ae501cb946d6"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 17:46:01.562123 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.562101 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24905c7-188a-4ce8-b261-ae501cb946d6-kube-api-access-5mp4x" (OuterVolumeSpecName: "kube-api-access-5mp4x") pod "c24905c7-188a-4ce8-b261-ae501cb946d6" (UID: "c24905c7-188a-4ce8-b261-ae501cb946d6"). InnerVolumeSpecName "kube-api-access-5mp4x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:46:01.660757 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.660730 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c24905c7-188a-4ce8-b261-ae501cb946d6-cabundle-cert\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:46:01.660757 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.660758 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c24905c7-188a-4ce8-b261-ae501cb946d6-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:46:01.660941 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.660768 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5mp4x\" (UniqueName: \"kubernetes.io/projected/c24905c7-188a-4ce8-b261-ae501cb946d6-kube-api-access-5mp4x\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:46:01.794407 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.794375 2575 generic.go:358] "Generic (PLEG): container finished" podID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerID="9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb" exitCode=0 Mar 18 17:46:01.794585 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.794444 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" Mar 18 17:46:01.794585 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.794447 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" event={"ID":"c24905c7-188a-4ce8-b261-ae501cb946d6","Type":"ContainerDied","Data":"9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb"} Mar 18 17:46:01.794585 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.794482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7" event={"ID":"c24905c7-188a-4ce8-b261-ae501cb946d6","Type":"ContainerDied","Data":"ccc7719fdc82be2ce8300d34cdba155592be61a6dde90fc4d4757158b85b8b5a"} Mar 18 17:46:01.794585 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.794497 2575 scope.go:117] "RemoveContainer" containerID="9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb" Mar 18 17:46:01.803022 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.803007 2575 scope.go:117] "RemoveContainer" containerID="5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e" Mar 18 17:46:01.814554 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.814537 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7"] Mar 18 17:46:01.817823 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.817803 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-78f59d8b66-jbcv7"] Mar 18 17:46:01.826222 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.826203 2575 scope.go:117] "RemoveContainer" containerID="9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb" Mar 18 17:46:01.826517 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:46:01.826499 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb\": container with ID starting with 9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb not found: ID does not exist" containerID="9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb" Mar 18 17:46:01.826589 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.826530 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb"} err="failed to get container status \"9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb\": rpc error: code = NotFound desc = could not find container \"9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb\": container with ID starting with 9528aaab9098b96dfb07ebed964caeb8ffb50aaf8194c8221e698df4d2db17eb not found: ID does not exist" Mar 18 17:46:01.826589 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.826548 2575 scope.go:117] "RemoveContainer" containerID="5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e" Mar 18 17:46:01.826770 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:46:01.826756 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e\": container with ID starting with 5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e not found: ID does not exist" containerID="5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e" Mar 18 17:46:01.826816 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:01.826773 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e"} err="failed to get container status \"5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e\": rpc error: code = NotFound desc = could not find container \"5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e\": container with ID starting with 5adffd85e6654d8c4f0ccc1cb7d8f8f33d11684c5277f457abdbee2c97db300e not found: ID does not exist" Mar 18 17:46:02.289802 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:02.289779 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" path="/var/lib/kubelet/pods/c24905c7-188a-4ce8-b261-ae501cb946d6/volumes" Mar 18 17:46:04.806163 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:04.806137 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8/storage-initializer/0.log" Mar 18 17:46:04.806541 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:04.806175 2575 generic.go:358] "Generic (PLEG): container finished" podID="ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8" containerID="8a9c3cb4fdc9356f5e971cadc9ce1edb05bdea7a1a5390d458bfc2d633ff9a30" exitCode=1 Mar 18 17:46:04.806541 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:04.806226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" event={"ID":"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8","Type":"ContainerDied","Data":"8a9c3cb4fdc9356f5e971cadc9ce1edb05bdea7a1a5390d458bfc2d633ff9a30"} Mar 18 17:46:05.811245 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:05.811219 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8/storage-initializer/0.log" Mar 18 17:46:05.811630 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:05.811323 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" event={"ID":"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8","Type":"ContainerStarted","Data":"341e58b56adcfc054170ca7b5b6f13f208f994dac3b57864514590ff7d526a8d"} Mar 18 17:46:07.286990 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:46:07.286963 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:46:07.818350 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:07.818318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8/storage-initializer/1.log" Mar 18 17:46:07.818680 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:07.818664 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8/storage-initializer/0.log" Mar 18 17:46:07.818729 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:07.818699 2575 generic.go:358] "Generic (PLEG): container finished" podID="ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8" containerID="341e58b56adcfc054170ca7b5b6f13f208f994dac3b57864514590ff7d526a8d" exitCode=1 Mar 18 17:46:07.818796 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:07.818777 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" event={"ID":"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8","Type":"ContainerDied","Data":"341e58b56adcfc054170ca7b5b6f13f208f994dac3b57864514590ff7d526a8d"} Mar 18 17:46:07.818834 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:07.818819 2575 scope.go:117] "RemoveContainer" containerID="8a9c3cb4fdc9356f5e971cadc9ce1edb05bdea7a1a5390d458bfc2d633ff9a30" Mar 18 17:46:07.819151 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:07.819124 2575 scope.go:117] "RemoveContainer" containerID="8a9c3cb4fdc9356f5e971cadc9ce1edb05bdea7a1a5390d458bfc2d633ff9a30" Mar 18 17:46:07.834236 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:46:07.834193 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_kserve-ci-e2e-test_ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8_0 in pod sandbox 4c9f737559f91bd3f57b6a10d52f50f60e3d5fb2a0da243fc297a55231403b7c from index: no such id: '8a9c3cb4fdc9356f5e971cadc9ce1edb05bdea7a1a5390d458bfc2d633ff9a30'" 
containerID="8a9c3cb4fdc9356f5e971cadc9ce1edb05bdea7a1a5390d458bfc2d633ff9a30" Mar 18 17:46:07.834351 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:46:07.834252 2575 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_kserve-ci-e2e-test_ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8_0 in pod sandbox 4c9f737559f91bd3f57b6a10d52f50f60e3d5fb2a0da243fc297a55231403b7c from index: no such id: '8a9c3cb4fdc9356f5e971cadc9ce1edb05bdea7a1a5390d458bfc2d633ff9a30'; Skipping pod \"isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_kserve-ci-e2e-test(ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8)\"" logger="UnhandledError" Mar 18 17:46:07.835557 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:46:07.835537 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_kserve-ci-e2e-test(ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" podUID="ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8" Mar 18 17:46:08.139666 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:08.139638 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz"] Mar 18 17:46:08.823716 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:08.823689 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8/storage-initializer/1.log" Mar 18 17:46:08.948650 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:08.948627 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8/storage-initializer/1.log" Mar 18 17:46:08.948774 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:08.948689 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" Mar 18 17:46:09.017931 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.017907 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vtqq\" (UniqueName: \"kubernetes.io/projected/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kube-api-access-6vtqq\") pod \"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8\" (UID: \"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8\") " Mar 18 17:46:09.018076 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.017950 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kserve-provision-location\") pod \"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8\" (UID: \"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8\") " Mar 18 17:46:09.018237 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.018211 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8" (UID: "ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:46:09.020095 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.020076 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kube-api-access-6vtqq" (OuterVolumeSpecName: "kube-api-access-6vtqq") pod "ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8" (UID: "ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8"). InnerVolumeSpecName "kube-api-access-6vtqq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:46:09.119352 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.119295 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vtqq\" (UniqueName: \"kubernetes.io/projected/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kube-api-access-6vtqq\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:46:09.119352 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.119324 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8-kserve-provision-location\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Mar 18 17:46:09.828025 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.827997 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz_ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8/storage-initializer/1.log" Mar 18 17:46:09.828426 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.828065 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" event={"ID":"ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8","Type":"ContainerDied","Data":"4c9f737559f91bd3f57b6a10d52f50f60e3d5fb2a0da243fc297a55231403b7c"} Mar 18 17:46:09.828426 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.828093 2575 scope.go:117] "RemoveContainer" containerID="341e58b56adcfc054170ca7b5b6f13f208f994dac3b57864514590ff7d526a8d" Mar 18 17:46:09.828426 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.828102 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz" Mar 18 17:46:09.873163 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.873133 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz"] Mar 18 17:46:09.878805 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:09.878780 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-8444f5d8d4-csvgz"] Mar 18 17:46:10.290299 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:10.290270 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8" path="/var/lib/kubelet/pods/ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8/volumes" Mar 18 17:46:20.286920 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:20.286865 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:46:20.287130 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:46:20.287052 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:46:33.286577 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:46:33.286526 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:46:42.216207 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:42.216182 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gdm4b_4e6ee6d9-0f25-4ecd-b286-3b4f51bd62a5/global-pull-secret-syncer/0.log" Mar 18 17:46:42.422193 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:42.422167 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-88mp7_cac8f65a-dd3e-4b2c-8a45-1ba1e842dd91/konnectivity-agent/0.log" Mar 18 17:46:42.532320 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:42.532257 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-49.ec2.internal_ea09336bc3d6205db966e5eb690fe85d/haproxy/0.log" Mar 18 17:46:45.287172 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:46:45.287124 2575 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:46:45.959234 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:45.959202 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-rph27_50735fa3-85e2-4bcb-ad67-9dd10af25a64/cluster-monitoring-operator/0.log" Mar 18 17:46:46.125541 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:46.125506 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2m2rc_1ee329a3-8fbd-4f66-b69a-0534ee4fe51c/node-exporter/0.log" Mar 18 17:46:46.151414 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:46.151392 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2m2rc_1ee329a3-8fbd-4f66-b69a-0534ee4fe51c/kube-rbac-proxy/0.log" Mar 18 17:46:46.176671 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:46.176656 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2m2rc_1ee329a3-8fbd-4f66-b69a-0534ee4fe51c/init-textfile/0.log" Mar 18 17:46:46.704054 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:46.704032 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-kphbn_a090e3ca-196c-44e4-8c04-ba9e61392e3c/prometheus-operator/0.log" Mar 18 17:46:46.732611 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:46.732586 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-kphbn_a090e3ca-196c-44e4-8c04-ba9e61392e3c/kube-rbac-proxy/0.log" Mar 18 17:46:49.102346 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.102316 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fd89d9bf9-mh6nr_3464ba8e-beb1-44e3-b263-a29520f68252/console/0.log" Mar 18 17:46:49.135844 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.135819 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-5b85974fd6-jcxw6_d6c35e1e-b5a5-4e45-a199-26600d617988/download-server/0.log" Mar 18 17:46:49.195677 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.195642 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd"] Mar 18 17:46:49.195986 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.195973 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8" containerName="storage-initializer" Mar 18 17:46:49.195986 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.195987 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8" containerName="storage-initializer" Mar 18 17:46:49.196087 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.196006 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="storage-initializer" Mar 18 17:46:49.196087 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.196012 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="storage-initializer" Mar 18 17:46:49.196087 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.196019 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" Mar 18 17:46:49.196087 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.196024 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" Mar 18 17:46:49.196087 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.196073 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8" containerName="storage-initializer" Mar 18 17:46:49.196087 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.196081 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c24905c7-188a-4ce8-b261-ae501cb946d6" containerName="kserve-container" Mar 18 17:46:49.196087 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.196089 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae3a0f98-7a37-4668-80c1-b1c06a9fa0a8" containerName="storage-initializer" Mar 18 17:46:49.198921 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.198905 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.200938 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.200915 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vxxd9\"/\"kube-root-ca.crt\"" Mar 18 17:46:49.201286 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.201264 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vxxd9\"/\"default-dockercfg-w25nj\"" Mar 18 17:46:49.201286 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.201281 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vxxd9\"/\"openshift-service-ca.crt\"" Mar 18 17:46:49.209941 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.209919 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd"] Mar 18 17:46:49.305976 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.305939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4znld\" (UniqueName: \"kubernetes.io/projected/b284d161-c961-4d35-9778-9662c7f8d8cc-kube-api-access-4znld\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.306113 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.305982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-sys\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.306113 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.306000 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-proc\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.306113 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.306020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-podres\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.306113 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.306071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-lib-modules\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.407277 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.407248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-sys\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.407277 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.407278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-proc\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.407543 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.407299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-podres\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.407543 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.407328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-lib-modules\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.407543 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.407398 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-sys\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.407543 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.407415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-proc\") pod 
\"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.407543 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.407452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4znld\" (UniqueName: \"kubernetes.io/projected/b284d161-c961-4d35-9778-9662c7f8d8cc-kube-api-access-4znld\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.407543 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.407463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-podres\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.407543 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.407520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b284d161-c961-4d35-9778-9662c7f8d8cc-lib-modules\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.419765 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.419740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4znld\" (UniqueName: \"kubernetes.io/projected/b284d161-c961-4d35-9778-9662c7f8d8cc-kube-api-access-4znld\") pod \"perf-node-gather-daemonset-6x8bd\" (UID: \"b284d161-c961-4d35-9778-9662c7f8d8cc\") " pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.509197 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.509171 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.576017 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.575993 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-67fdcb5769-mgr8h_9cb162d5-5574-4cc5-86c3-c3e6f26276b5/volume-data-source-validator/0.log" Mar 18 17:46:49.635692 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.635656 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd"] Mar 18 17:46:49.638593 ip-10-0-139-49 kubenswrapper[2575]: W0318 17:46:49.638568 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb284d161_c961_4d35_9778_9662c7f8d8cc.slice/crio-4c4db51a0fc35f88cf39a99cae00229b14c9297546f777d3d84e16e151290fd3 WatchSource:0}: Error finding container 4c4db51a0fc35f88cf39a99cae00229b14c9297546f777d3d84e16e151290fd3: Status 404 returned error can't find the container with id 4c4db51a0fc35f88cf39a99cae00229b14c9297546f777d3d84e16e151290fd3 Mar 18 17:46:49.951958 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.951874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" event={"ID":"b284d161-c961-4d35-9778-9662c7f8d8cc","Type":"ContainerStarted","Data":"9b9a771c53b2a9b1d9d783bca4dbda5835ac46783b1a8cbad6fbbf554394c186"} Mar 18 17:46:49.951958 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.951913 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" event={"ID":"b284d161-c961-4d35-9778-9662c7f8d8cc","Type":"ContainerStarted","Data":"4c4db51a0fc35f88cf39a99cae00229b14c9297546f777d3d84e16e151290fd3"} Mar 18 17:46:49.952159 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.951998 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:49.973853 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:49.973799 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" podStartSLOduration=0.973782303 podStartE2EDuration="973.782303ms" podCreationTimestamp="2026-03-18 17:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:46:49.971320513 +0000 UTC m=+3728.282573434" watchObservedRunningTime="2026-03-18 17:46:49.973782303 +0000 UTC m=+3728.285035224" Mar 18 17:46:50.279631 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:50.279562 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-697ns_7443181f-d141-4218-bcc9-ca8fbafa0034/dns/0.log" Mar 18 17:46:50.307837 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:50.307818 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-697ns_7443181f-d141-4218-bcc9-ca8fbafa0034/kube-rbac-proxy/0.log" Mar 18 17:46:50.472236 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:50.472214 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kmk8h_0452f967-d43e-4fc8-8591-3f8887d642ef/dns-node-resolver/0.log" Mar 18 17:46:50.926145 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:50.926109 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_image-registry-678cf799c4-b9fk8_e6fb9fc9-d5ea-480c-8389-da779da43ffd/registry/0.log" Mar 18 17:46:51.003683 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:51.003645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lqcmn_1def291b-47ff-4e2f-b1a9-30275edc2bd9/node-ca/0.log" Mar 18 17:46:52.258377 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:52.258342 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cx9cp_df0141d5-7fdb-4d38-a11e-e2f21fffe1bb/serve-healthcheck-canary/0.log" Mar 18 17:46:52.793301 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:52.793275 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vpb6s_87197265-a9f7-4fd6-bbde-1bec72074140/kube-rbac-proxy/0.log" Mar 18 17:46:52.813242 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:52.813222 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vpb6s_87197265-a9f7-4fd6-bbde-1bec72074140/exporter/0.log" Mar 18 17:46:52.835394 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:52.835353 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vpb6s_87197265-a9f7-4fd6-bbde-1bec72074140/extractor/0.log" Mar 18 17:46:55.133521 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:55.133490 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-twgv8_71bd2296-9d68-4ee7-96fc-7080d3255d69/manager/0.log" Mar 18 17:46:55.297925 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:55.297890 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-nbfjr_9a9d8fcc-a639-41e7-96d6-a089e8324b5d/seaweedfs-tls-custom/0.log" Mar 18 17:46:55.321597 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:55.321573 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-9f7vn_af9bfd01-4022-420b-a391-212857be16a7/seaweedfs-tls-serving/0.log" Mar 18 17:46:55.967259 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:55.967228 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vxxd9/perf-node-gather-daemonset-6x8bd" Mar 18 17:46:59.286847 ip-10-0-139-49 kubenswrapper[2575]: E0318 17:46:59.286820 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-ttb7x" podUID="e41abd12-239d-46fb-9cdb-9d35fa51024d" Mar 18 17:46:59.495383 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:59.495298 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-q79g8_45a755dc-3ea3-453d-b63c-eb0cb38273c6/migrator/0.log" Mar 18 17:46:59.522707 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:59.522681 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-q79g8_45a755dc-3ea3-453d-b63c-eb0cb38273c6/graceful-termination/0.log" Mar 18 17:46:59.865275 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:59.865184 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-cwhks_a6b6b964-0003-450e-8d89-0bc782c50559/kube-storage-version-migrator-operator/1.log" Mar 18 17:46:59.866548 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:46:59.866528 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-cwhks_a6b6b964-0003-450e-8d89-0bc782c50559/kube-storage-version-migrator-operator/0.log" Mar 18 17:47:01.348794 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:01.348766 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2ddv_f8f81e7e-bafc-43ea-a249-a4269a4090b1/kube-multus-additional-cni-plugins/0.log" Mar 18 17:47:01.373325 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:01.373303 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2ddv_f8f81e7e-bafc-43ea-a249-a4269a4090b1/egress-router-binary-copy/0.log" Mar 18 17:47:01.396225 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:01.396207 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2ddv_f8f81e7e-bafc-43ea-a249-a4269a4090b1/cni-plugins/0.log" Mar 18 17:47:01.419334 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:01.419315 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2ddv_f8f81e7e-bafc-43ea-a249-a4269a4090b1/bond-cni-plugin/0.log" Mar 18 17:47:01.444855 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:01.444830 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2ddv_f8f81e7e-bafc-43ea-a249-a4269a4090b1/routeoverride-cni/0.log" Mar 18 17:47:01.471456 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:01.471434 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2ddv_f8f81e7e-bafc-43ea-a249-a4269a4090b1/whereabouts-cni-bincopy/0.log" Mar 18 17:47:01.565399 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:01.565376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2ddv_f8f81e7e-bafc-43ea-a249-a4269a4090b1/whereabouts-cni/0.log" Mar 18 17:47:01.629606 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:01.629560 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qq8q7_84aaa605-3cf0-4df0-9d6c-85011b39807e/kube-multus/0.log" Mar 18 17:47:01.840111 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:01.840091 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nbj8s_d6948911-017b-4b29-b362-5520b984c273/network-metrics-daemon/0.log" Mar 18 17:47:01.873494 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:01.873475 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nbj8s_d6948911-017b-4b29-b362-5520b984c273/kube-rbac-proxy/0.log" Mar 18 17:47:03.156640 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:03.156612 2575 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-controller/0.log" Mar 18 17:47:03.174695 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:03.174637 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/0.log" Mar 18 17:47:03.194976 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:03.194951 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovn-acl-logging/1.log" Mar 18 17:47:03.223674 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:03.223653 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/kube-rbac-proxy-node/0.log" Mar 18 17:47:03.249631 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:03.249611 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/kube-rbac-proxy-ovn-metrics/0.log" Mar 18 17:47:03.272724 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:03.272708 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/northd/0.log" Mar 18 17:47:03.300530 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:03.300497 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/nbdb/0.log" Mar 18 17:47:03.326550 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:03.326532 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/sbdb/0.log" Mar 18 17:47:03.443335 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:03.443262 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qktrm_a51f93e2-c750-4700-bb46-109456c7c78a/ovnkube-controller/0.log" Mar 18 17:47:04.816912 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:04.816886 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hr9bv_35d62bb3-aced-4698-ba43-4c7f828a050d/network-check-target-container/0.log" Mar 18 17:47:06.001651 ip-10-0-139-49 kubenswrapper[2575]: I0318 17:47:06.001620 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-kcvj8_af60e70e-528b-44c4-a08d-8bd69ee9547f/iptables-alerter/0.log"