Mar 18 16:43:28.828258 ip-10-0-133-190 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:43:29.283901 ip-10-0-133-190 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:29.283901 ip-10-0-133-190 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:43:29.286730 ip-10-0-133-190 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:29.286730 ip-10-0-133-190 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:43:29.286822 ip-10-0-133-190 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:29.288005 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.287813 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:43:29.293627 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293611 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:29.293627 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293627 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293631 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293635 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293639 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293642 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293645 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293648 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293651 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293653 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293656 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293659 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293661 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293664 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293666 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293669 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293672 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293674 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293677 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293680 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293684 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:29.293694 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293701 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293705 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293709 2576 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293711 2576 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293714 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293717 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293719 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293722 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293724 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293727 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293729 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293733 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293735 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293738 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293740 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293743 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293745 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293747 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293751 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293754 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:29.294191 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293757 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293759 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293762 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293764 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293767 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293769 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293772 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293775 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293777 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293780 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293782 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293785 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293788 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293791 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293795 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293798 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293801 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293804 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293806 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293809 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:29.294674 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293811 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293814 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293817 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293820 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293823 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293826 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293828 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293831 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293833 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293836 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293839 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293842 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293845 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293847 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293851 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293854 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293856 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293859 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293862 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:29.295179 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293864 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293867 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293869 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293872 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293874 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.293877 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294304 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294309 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294312 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294315 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294317 2576 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294320 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294323 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294326 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294328 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294331 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294333 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294336 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294339 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294341 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:29.295637 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294344 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294346 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294349 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294352 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294355 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294358 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294360 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294363 2576 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294366 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294369 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294371 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294374 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294377 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294380 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294382 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294385 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294388 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294390 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294393 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294396 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:29.296130 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294398 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294402 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294406 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294409 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294413 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294416 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294419 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294421 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294425 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294428 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294431 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294434 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294438 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294440 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294443 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294446 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294449 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294452 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294454 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:29.296628 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294457 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294460 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294462 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294465 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294468 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294470 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294473 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294476 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294478 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294481 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294483 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294486 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294488 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294491 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294494 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294496 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294499 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294503 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294505 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294508 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:29.297123 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294510 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294513 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294515 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294518 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294520 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294523 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294525 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294529 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294532 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294534 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294536 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294539 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.294542 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294619 2576 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294627 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294635 2576 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294642 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294653 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294658 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294663 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294667 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:43:29.297604 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294671 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294674 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294678 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294681 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294684 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294688 2576 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294691 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294694 2576 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294696 2576 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294699 2576 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294704 2576 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294708 2576 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294711 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294714 2576 flags.go:64] FLAG: --config-dir=""
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294717 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294721 2576 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294725 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294728 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294731 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294734 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294737 2576 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294740 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294743 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294746 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294749 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:43:29.298125 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294753 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294757 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294767 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294772 2576 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294775 2576 flags.go:64] FLAG: --enable-server="true"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294778 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294783 2576 flags.go:64] FLAG: --event-burst="100"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294787 2576 flags.go:64] FLAG: --event-qps="50"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294790 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294793 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294796 2576 flags.go:64] FLAG: --eviction-hard=""
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294800 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294803 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294807 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294810 2576 flags.go:64] FLAG: --eviction-soft=""
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294813 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294815 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294820 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 16:43:29.298752 ip-10-0-133-190
kubenswrapper[2576]: I0318 16:43:29.294823 2576 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294826 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294829 2576 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294832 2576 flags.go:64] FLAG: --feature-gates="" Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294836 2576 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294839 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294842 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 16:43:29.298752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294845 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294848 2576 flags.go:64] FLAG: --healthz-port="10248" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294851 2576 flags.go:64] FLAG: --help="false" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294854 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294857 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294860 2576 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294864 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 
16:43:29.294867 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294870 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294874 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294879 2576 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294881 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294884 2576 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294887 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294891 2576 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294893 2576 flags.go:64] FLAG: --kube-reserved="" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294897 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294899 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294902 2576 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294905 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294908 2576 flags.go:64] FLAG: --lock-file="" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294911 
2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294914 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294917 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:43:29.299400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294924 2576 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294926 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294929 2576 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294932 2576 flags.go:64] FLAG: --logging-format="text" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294947 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294951 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294954 2576 flags.go:64] FLAG: --manifest-url="" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294956 2576 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294961 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294964 2576 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294969 2576 flags.go:64] FLAG: --max-pods="110" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294972 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:43:29.300006 
ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294975 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294978 2576 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294980 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294983 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294986 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294989 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.294998 2576 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295001 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295005 2576 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295008 2576 flags.go:64] FLAG: --pod-cidr="" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295010 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:43:29.300006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295016 2576 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295019 2576 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295022 2576 flags.go:64] 
FLAG: --pods-per-core="0" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295025 2576 flags.go:64] FLAG: --port="10250" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295029 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295031 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07931cd810a5d52ca" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295035 2576 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295038 2576 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295041 2576 flags.go:64] FLAG: --register-node="true" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295045 2576 flags.go:64] FLAG: --register-schedulable="true" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295048 2576 flags.go:64] FLAG: --register-with-taints="" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295051 2576 flags.go:64] FLAG: --registry-burst="10" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295054 2576 flags.go:64] FLAG: --registry-qps="5" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295057 2576 flags.go:64] FLAG: --reserved-cpus="" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295060 2576 flags.go:64] FLAG: --reserved-memory="" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295064 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295067 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295070 2576 flags.go:64] FLAG: 
--rotate-certificates="false" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295073 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295076 2576 flags.go:64] FLAG: --runonce="false" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295079 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295082 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295085 2576 flags.go:64] FLAG: --seccomp-default="false" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295088 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295091 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295094 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 16:43:29.300612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295097 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295100 2576 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295105 2576 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295108 2576 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295111 2576 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295114 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 
16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295117 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295120 2576 flags.go:64] FLAG: --system-cgroups="" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295123 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295129 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295132 2576 flags.go:64] FLAG: --tls-cert-file="" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295134 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295138 2576 flags.go:64] FLAG: --tls-min-version="" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295141 2576 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295144 2576 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295148 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295151 2576 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295154 2576 flags.go:64] FLAG: --v="2" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295159 2576 flags.go:64] FLAG: --version="false" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295163 2576 flags.go:64] FLAG: --vmodule="" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295167 2576 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.295170 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295270 2576 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295273 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:43:29.301306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295276 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295279 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295282 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295285 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295287 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295290 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295293 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295295 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295298 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:43:29.301919 ip-10-0-133-190 
kubenswrapper[2576]: W0318 16:43:29.295300 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295304 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295313 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295316 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295319 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295322 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295325 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295327 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295330 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295333 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295335 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:43:29.301919 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295338 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295341 2576 feature_gate.go:328] unrecognized feature gate: 
NewOLMOwnSingleNamespace Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295343 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295347 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295350 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295352 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295355 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295358 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295360 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295363 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295366 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295368 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295371 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295374 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:43:29.302481 ip-10-0-133-190 
kubenswrapper[2576]: W0318 16:43:29.295376 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295379 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295381 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295384 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295386 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295389 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:43:29.302481 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295392 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295394 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295398 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295401 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295404 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295407 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295410 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295414 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295418 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295420 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295423 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295425 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295428 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295430 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295433 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295437 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295439 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295442 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:43:29.302996 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295444 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:43:29.302996 
ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295447 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295449 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295452 2576 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295455 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295457 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295460 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295462 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295465 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295467 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295470 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295472 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295475 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295477 2576 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter
Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295479 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295482 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295486 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295490 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295494 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295497 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295500 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:29.303457 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295504 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295506 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295509 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295512 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.295515 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.296477 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.303293 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.303308 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303355 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303360 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303364 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303367 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303370 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303373 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303376 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:29.303963 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303379 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303382 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303385 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303388 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303390 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303393 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303396 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303398 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303401 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303404 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303406 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303409 2576 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303411 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303414 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303417 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303421 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303423 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303426 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303428 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303431 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:29.304360 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303434 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303436 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303439 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303441 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303444 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303447 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303449 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303452 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303455 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303457 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303460 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303463 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303466 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303471 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303475 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303478 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303481 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303484 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303488 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:29.304841 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303492 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303495 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303497 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303500 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303503 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303506 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303508 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303511 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303514 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303517 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303520 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303522 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303525 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303528 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303530 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303533 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303535 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303538 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303540 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:29.305345 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303543 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303546 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303549 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303553 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303555 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303558 2576 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303560 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303563 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303566 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303569 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303571 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303574 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303576 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303579 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303581 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303584 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303586 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303589 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303592 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303595 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:29.305845 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303597 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.303603 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303718 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303723 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303727 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303730 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303732 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303736 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303739 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303742 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303745 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303748 2576 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303750 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303753 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303756 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303758 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:29.306316 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303761 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303763 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303766 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303768 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303771 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303774 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303776 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303780 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303784 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303787 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303790 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303792 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303795 2576 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303797 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303800 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303802 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303805 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303808 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303811 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:29.306753 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303813 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303816 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303819 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303821 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303824 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303827 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303829 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303832 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303834 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303837 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303839 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303842 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303844 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303847 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303849 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303852 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303854 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303857 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303859 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303862 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:29.307306 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303864 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303867 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303869 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303872 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303874 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303877 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303879 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303882 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303884 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303887 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303890 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303892 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303895 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303898 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303901 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303903 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303908 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303912 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303915 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303918 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:29.307802 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303921 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303923 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303926 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303928 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303931 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303933 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303948 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303951 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303954 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303956 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303959 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303961 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:29.303964 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.303969 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.304082 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Mar 18 16:43:29.308336 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.306639 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 18 16:43:29.308705 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.308035 2576 server.go:1019] "Starting client certificate rotation"
Mar 18 16:43:29.308705 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.308136 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:43:29.308705 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.308171 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:43:29.336810 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.336786 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:43:29.345570 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.345543 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:43:29.361892 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.361871 2576 log.go:25] "Validated CRI v1 runtime API"
Mar 18 16:43:29.366737 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.366716 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:43:29.367992 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.367974 2576 log.go:25] "Validated CRI v1 image API"
Mar 18 16:43:29.369711 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.369695 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 18 16:43:29.373228 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.373208 2576 fs.go:135] Filesystem UUIDs: map[58148b22-6a80-49f5-aaec-47e1842bcfd2:/dev/nvme0n1p4 6c28b16a-9194-459d-b013-425e4c9d3a46:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Mar 18 16:43:29.373280 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.373229 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 18 16:43:29.378832 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.378711 2576 manager.go:217] Machine: {Timestamp:2026-03-18 16:43:29.377486767 +0000 UTC m=+0.427302185 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100420 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec228ad7d629815f1d3d36f9eea6183c SystemUUID:ec228ad7-d629-815f-1d3d-36f9eea6183c BootID:da9993a5-323b-40a9-ac5d-542fadaa0132 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:25:fc:77:d4:d1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:25:fc:77:d4:d1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0e:6c:55:10:f2:4c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 18 16:43:29.378832 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.378827 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 18 16:43:29.378934 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.378912 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 16:43:29.380558 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.380530 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 16:43:29.380708 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.380559 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-190.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 16:43:29.380757 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.380721 2576 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 16:43:29.380757 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.380730 2576 container_manager_linux.go:306] "Creating device plugin manager"
Mar 18 16:43:29.380757 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.380746
2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:43:29.380834 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.380759 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:43:29.382756 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.382746 2576 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:43:29.382877 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.382867 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Mar 18 16:43:29.385479 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.385469 2576 kubelet.go:491] "Attempting to sync node with API server" Mar 18 16:43:29.385538 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.385483 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 18 16:43:29.386211 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.386199 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 16:43:29.386259 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.386217 2576 kubelet.go:397] "Adding apiserver pod source" Mar 18 16:43:29.386259 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.386230 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 18 16:43:29.387570 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.387559 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:43:29.387615 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.387577 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:43:29.390791 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.390745 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lvb9z" Mar 18 16:43:29.393616 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:43:29.393596 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1" Mar 18 16:43:29.395654 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.395632 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 18 16:43:29.396368 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.396350 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lvb9z" Mar 18 16:43:29.397860 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397846 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 16:43:29.397860 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397863 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 16:43:29.397959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397869 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 16:43:29.397959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397874 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 16:43:29.397959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397880 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 16:43:29.397959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397886 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 16:43:29.397959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397891 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 16:43:29.397959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397897 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 16:43:29.397959 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:43:29.397904 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 16:43:29.397959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397910 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 16:43:29.397959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397918 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 16:43:29.397959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.397927 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 16:43:29.399018 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.399006 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 16:43:29.399065 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.399021 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Mar 18 16:43:29.402828 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.402815 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 18 16:43:29.402903 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.402851 2576 server.go:1295] "Started kubelet" Mar 18 16:43:29.403028 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.402977 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 16:43:29.403570 ip-10-0-133-190 systemd[1]: Started Kubernetes Kubelet. 
Mar 18 16:43:29.404143 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.404036 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:29.404680 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.404632 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 16:43:29.404728 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.404705 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 18 16:43:29.405013 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.404989 2576 server.go:317] "Adding debug handlers to kubelet server" Mar 18 16:43:29.405314 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.405296 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:29.405722 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.405703 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 16:43:29.407400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.407383 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-190.ec2.internal" not found Mar 18 16:43:29.411834 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.411813 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Mar 18 16:43:29.412364 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.412350 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 16:43:29.413652 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:29.413629 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-190.ec2.internal\" not found" Mar 18 16:43:29.413749 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:29.413707 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 18 16:43:29.413879 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.413859 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 16:43:29.413879 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.413878 2576 factory.go:55] Registering systemd factory Mar 18 16:43:29.414069 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.413887 2576 factory.go:223] Registration of the systemd container factory successfully Mar 18 16:43:29.414069 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.413986 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 18 16:43:29.414069 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.413987 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Mar 18 16:43:29.414069 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.414013 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 18 16:43:29.414240 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.414172 2576 factory.go:153] Registering CRI-O factory Mar 18 16:43:29.414240 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.414184 2576 reconstruct.go:97] "Volume reconstruction finished" Mar 18 16:43:29.414240 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.414192 2576 reconciler.go:26] "Reconciler: start to sync state" Mar 18 16:43:29.414240 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.414186 2576 factory.go:223] Registration of the crio container factory successfully Mar 18 16:43:29.414240 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.414222 2576 factory.go:103] Registering Raw factory Mar 18 16:43:29.414240 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.414232 2576 
manager.go:1196] Started watching for new ooms in manager Mar 18 16:43:29.414781 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.414619 2576 manager.go:319] Starting recovery of all containers Mar 18 16:43:29.415893 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.415864 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:29.417985 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:29.417929 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-190.ec2.internal\" not found" node="ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.424036 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.424005 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-190.ec2.internal" not found Mar 18 16:43:29.427104 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.427088 2576 manager.go:324] Recovery completed Mar 18 16:43:29.431230 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.431217 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:29.432947 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.432921 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:29.433003 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.432963 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:29.433003 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.432973 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:29.433400 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.433387 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Mar 18 16:43:29.433458 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:43:29.433399 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Mar 18 16:43:29.433458 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.433418 2576 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:43:29.434601 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.434590 2576 policy_none.go:49] "None policy: Start" Mar 18 16:43:29.434639 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.434605 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 18 16:43:29.434639 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.434614 2576 state_mem.go:35] "Initializing new in-memory state store" Mar 18 16:43:29.477739 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.477574 2576 manager.go:341] "Starting Device Plugin manager" Mar 18 16:43:29.495930 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:29.477781 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 18 16:43:29.495930 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.477793 2576 server.go:85] "Starting device plugin registration server" Mar 18 16:43:29.495930 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.478023 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 16:43:29.495930 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.478033 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 16:43:29.495930 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.478146 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 16:43:29.495930 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.478260 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 16:43:29.495930 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.478278 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" 
Mar 18 16:43:29.495930 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:29.478749 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Mar 18 16:43:29.495930 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:29.478802 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-190.ec2.internal\" not found" Mar 18 16:43:29.495930 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.485668 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-190.ec2.internal" not found Mar 18 16:43:29.556757 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.556670 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 18 16:43:29.557989 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.557968 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 18 16:43:29.558059 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.557999 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 18 16:43:29.558059 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.558040 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 18 16:43:29.558059 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.558049 2576 kubelet.go:2451] "Starting kubelet main sync loop" Mar 18 16:43:29.558184 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:29.558092 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 18 16:43:29.561030 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.561005 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:29.578392 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.578373 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:29.579216 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.579200 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:29.579298 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.579230 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:29.579298 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.579249 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:29.579298 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.579292 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.589297 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.589268 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.658482 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.658447 2576 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal"] Mar 18 16:43:29.661051 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.661024 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.661172 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.661025 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.677486 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.677469 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.681894 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.681879 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.696128 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.696105 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:43:29.696226 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.696155 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:43:29.716285 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.716259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e78a67ecfa4cc7c383866ea4270c748-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"7e78a67ecfa4cc7c383866ea4270c748\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.716394 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.716288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e78a67ecfa4cc7c383866ea4270c748-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"7e78a67ecfa4cc7c383866ea4270c748\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.716394 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.716306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1fbacc3938de459504eb6ed80d05d8dc-config\") pod \"kube-apiserver-proxy-ip-10-0-133-190.ec2.internal\" (UID: \"1fbacc3938de459504eb6ed80d05d8dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.817338 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.817263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e78a67ecfa4cc7c383866ea4270c748-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"7e78a67ecfa4cc7c383866ea4270c748\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.817338 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.817301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e78a67ecfa4cc7c383866ea4270c748-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"7e78a67ecfa4cc7c383866ea4270c748\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.817479 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:43:29.817339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1fbacc3938de459504eb6ed80d05d8dc-config\") pod \"kube-apiserver-proxy-ip-10-0-133-190.ec2.internal\" (UID: \"1fbacc3938de459504eb6ed80d05d8dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.817479 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.817355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e78a67ecfa4cc7c383866ea4270c748-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"7e78a67ecfa4cc7c383866ea4270c748\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.817479 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.817392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1fbacc3938de459504eb6ed80d05d8dc-config\") pod \"kube-apiserver-proxy-ip-10-0-133-190.ec2.internal\" (UID: \"1fbacc3938de459504eb6ed80d05d8dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.817479 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.817426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e78a67ecfa4cc7c383866ea4270c748-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal\" (UID: \"7e78a67ecfa4cc7c383866ea4270c748\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" Mar 18 16:43:29.999802 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:29.999744 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" Mar 18 16:43:30.000805 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.000786 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" Mar 18 16:43:30.308015 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.307907 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 16:43:30.308532 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.308088 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:43:30.308532 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.308132 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:43:30.308532 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.308127 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:43:30.387291 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.387258 2576 apiserver.go:52] "Watching apiserver" Mar 18 16:43:30.391719 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.391697 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 18 16:43:30.392084 ip-10-0-133-190 kubenswrapper[2576]: 
I0318 16:43:30.392066 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sr4nn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal","openshift-multus/multus-additional-cni-plugins-x7hss","openshift-multus/multus-wqbkg","openshift-network-diagnostics/network-check-target-j676s","kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh","openshift-multus/network-metrics-daemon-bd2jt","openshift-network-operator/iptables-alerter-hqszc","openshift-ovn-kubernetes/ovnkube-node-g2jdr","kube-system/konnectivity-agent-vclgl","openshift-cluster-node-tuning-operator/tuned-bt66t","openshift-dns/node-resolver-9w7wj"] Mar 18 16:43:30.393876 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.393856 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sr4nn" Mar 18 16:43:30.396530 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.395104 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.396530 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.396239 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 18 16:43:30.396530 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.396399 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lstmt\"" Mar 18 16:43:30.396892 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.396876 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 18 16:43:30.397583 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.397563 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 18 16:43:30.397583 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.397567 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 18 16:43:30.397743 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.397680 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 18 16:43:30.397743 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.397691 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 18 16:43:30.397743 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.397700 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:38:29 +0000 UTC" deadline="2027-09-10 10:38:09.147720631 +0000 UTC" Mar 18 16:43:30.397743 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.397722 
2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12977h54m38.750000731s" Mar 18 16:43:30.397957 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.397747 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 18 16:43:30.397957 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.397775 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 18 16:43:30.397957 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.397836 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7z5fg\"" Mar 18 16:43:30.398668 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.398652 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.399934 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.399921 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:43:30.400046 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:30.400007 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:30.400217 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.400202 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Mar 18 16:43:30.400299 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.400278 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bnwwn\""
Mar 18 16:43:30.401253 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.401239 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh"
Mar 18 16:43:30.402617 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.402600 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:30.402703 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.402674 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Mar 18 16:43:30.402703 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:30.402670 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:30.403070 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.403054 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Mar 18 16:43:30.403070 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.403067 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9rc74\""
Mar 18 16:43:30.403191 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.403088 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Mar 18 16:43:30.403952 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.403918 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hqszc"
Mar 18 16:43:30.405456 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.405439 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.405677 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.405578 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:43:30.405780 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.405765 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kjv5x\""
Mar 18 16:43:30.405827 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.405814 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Mar 18 16:43:30.405908 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.405895 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Mar 18 16:43:30.406883 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.406863 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vclgl"
Mar 18 16:43:30.407396 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.407380 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Mar 18 16:43:30.407396 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.407394 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rfqzx\""
Mar 18 16:43:30.407396 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.407400 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Mar 18 16:43:30.407826 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.407384 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Mar 18 16:43:30.407826 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.407786 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Mar 18 16:43:30.407826 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.407810 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Mar 18 16:43:30.408301 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.407904 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Mar 18 16:43:30.408477 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.408460 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Mar 18 16:43:30.408574 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.408510 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.408634 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.408584 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q6bb2\""
Mar 18 16:43:30.408876 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.408863 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Mar 18 16:43:30.409996 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.409979 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9w7wj"
Mar 18 16:43:30.410229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.410212 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:43:30.410289 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.410272 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wmhkw\""
Mar 18 16:43:30.410357 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.410345 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Mar 18 16:43:30.411886 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.411868 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Mar 18 16:43:30.411998 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.411927 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-84sd2\""
Mar 18 16:43:30.411998 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.411956 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Mar 18 16:43:30.412114 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.412019 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Mar 18 16:43:30.415020 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.415004 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 18 16:43:30.419080 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcsf6\" (UniqueName: \"kubernetes.io/projected/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-kube-api-access-wcsf6\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss"
Mar 18 16:43:30.419181 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-run-openvswitch\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.419181 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419127 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-run-ovn-kubernetes\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.419181 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-cni-netd\") pod
\"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.419302 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98ghs\" (UniqueName: \"kubernetes.io/projected/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-kube-api-access-98ghs\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.419302 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419266 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-socket-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh"
Mar 18 16:43:30.419302 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-run-systemd\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.419438 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-kubernetes\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.419438 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-cni-binary-copy\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss"
Mar 18 16:43:30.419438 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0bc34432-5dcb-459c-a6d8-587f77ae9dcf-hosts-file\") pod \"node-resolver-9w7wj\" (UID: \"0bc34432-5dcb-459c-a6d8-587f77ae9dcf\") " pod="openshift-dns/node-resolver-9w7wj"
Mar 18 16:43:30.419580 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-run-ovn\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.419580 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-systemd\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.419580 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-os-release\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") "
pod="openshift-multus/multus-additional-cni-plugins-x7hss"
Mar 18 16:43:30.419580 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzzmt\" (UniqueName: \"kubernetes.io/projected/e3720b07-5dd9-403e-9a66-17abd67f145f-kube-api-access-tzzmt\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:30.419580 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419541 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-env-overrides\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.419580 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqhzg\" (UniqueName: \"kubernetes.io/projected/ad6e055b-96d2-47b3-bb6b-f2e0165f3470-kube-api-access-zqhzg\") pod \"node-ca-sr4nn\" (UID: \"ad6e055b-96d2-47b3-bb6b-f2e0165f3470\") " pod="openshift-image-registry/node-ca-sr4nn"
Mar 18 16:43:30.419850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-system-cni-dir\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.419850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-lib-modules\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.419850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-host\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.419850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419661 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh"
Mar 18 16:43:30.419850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-run-multus-certs\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.419850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-slash\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.419850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419734 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-sysconfig\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.419850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-cnibin\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss"
Mar 18 16:43:30.419850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-registration-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh"
Mar 18 16:43:30.419850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-var-lib-cni-multus\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-conf-dir\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z68v5\" (UniqueName: \"kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5\") pod \"network-check-target-j676s\" (UID: \"ce437450-60be-4502-aa99-03a4af6c8e7c\") " pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.419915 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-sysctl-conf\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-tuned\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50878b4f-dc7c-45c7-81b8-e6f009741d18-tmp\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName:
\"kubernetes.io/configmap/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-sys-fs\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-run-k8s-cni-cncf-io\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fe7890a0-afa4-444d-9ad4-445e8890337a-iptables-alerter-script\") pod \"iptables-alerter-hqszc\" (UID: \"fe7890a0-afa4-444d-9ad4-445e8890337a\") " pod="openshift-network-operator/iptables-alerter-hqszc"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-node-log\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8gsz\" (UniqueName: \"kubernetes.io/projected/0bc34432-5dcb-459c-a6d8-587f77ae9dcf-kube-api-access-j8gsz\") pod \"node-resolver-9w7wj\" (UID: \"0bc34432-5dcb-459c-a6d8-587f77ae9dcf\") " pod="openshift-dns/node-resolver-9w7wj"
Mar 18 16:43:30.420219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420187 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-kubelet\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420275 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName:
\"kubernetes.io/secret/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-ovn-node-metrics-cert\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-sysctl-d\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-var-lib-openvswitch\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-run\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb5zq\" (UniqueName: \"kubernetes.io/projected/50878b4f-dc7c-45c7-81b8-e6f009741d18-kube-api-access-vb5zq\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-device-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad6e055b-96d2-47b3-bb6b-f2e0165f3470-host\") pod \"node-ca-sr4nn\" (UID: \"ad6e055b-96d2-47b3-bb6b-f2e0165f3470\") " pod="openshift-image-registry/node-ca-sr4nn"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/832b4122-2cb3-47a3-9cdb-f45d80349575-cni-binary-copy\") pod \"multus-wqbkg\" (UID:
\"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-hostroot\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.420729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6qh2\" (UniqueName: \"kubernetes.io/projected/832b4122-2cb3-47a3-9cdb-f45d80349575-kube-api-access-d6qh2\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-etc-kubernetes\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-cni-bin\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr"
Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bf84e45b-5912-4a57-b350-dd211f3513fe-agent-certs\") pod \"konnectivity-agent-vclgl\" (UID: \"bf84e45b-5912-4a57-b350-dd211f3513fe\") " pod="kube-system/konnectivity-agent-vclgl"
Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-run-netns\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-socket-dir-parent\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-var-lib-cni-bin\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg"
Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe7890a0-afa4-444d-9ad4-445e8890337a-host-slash\") pod \"iptables-alerter-hqszc\" (UID: \"fe7890a0-afa4-444d-9ad4-445e8890337a\") " pod="openshift-network-operator/iptables-alerter-hqszc"
Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName:
\"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-systemd-units\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-cnibin\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.420996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-daemon-config\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-run-netns\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-ovnkube-script-lib\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421073 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-system-cni-dir\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-etc-selinux\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9hs8\" (UniqueName: \"kubernetes.io/projected/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-kube-api-access-q9hs8\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-cni-dir\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.421229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-var-lib-kubelet\") pod \"multus-wqbkg\" (UID: 
\"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-etc-openvswitch\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-log-socket\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bc34432-5dcb-459c-a6d8-587f77ae9dcf-tmp-dir\") pod \"node-resolver-9w7wj\" (UID: \"0bc34432-5dcb-459c-a6d8-587f77ae9dcf\") " pod="openshift-dns/node-resolver-9w7wj" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-os-release\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ctb\" (UniqueName: 
\"kubernetes.io/projected/fe7890a0-afa4-444d-9ad4-445e8890337a-kube-api-access-22ctb\") pod \"iptables-alerter-hqszc\" (UID: \"fe7890a0-afa4-444d-9ad4-445e8890337a\") " pod="openshift-network-operator/iptables-alerter-hqszc" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-ovnkube-config\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421355 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-sys\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad6e055b-96d2-47b3-bb6b-f2e0165f3470-serviceca\") pod \"node-ca-sr4nn\" (UID: \"ad6e055b-96d2-47b3-bb6b-f2e0165f3470\") " pod="openshift-image-registry/node-ca-sr4nn" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bf84e45b-5912-4a57-b350-dd211f3513fe-konnectivity-ca\") pod \"konnectivity-agent-vclgl\" (UID: \"bf84e45b-5912-4a57-b350-dd211f3513fe\") " pod="kube-system/konnectivity-agent-vclgl" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421423 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-modprobe-d\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.421752 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.421441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-var-lib-kubelet\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.426653 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.426634 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 18 16:43:30.440848 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.440825 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gfvt5" Mar 18 16:43:30.450192 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.450169 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gfvt5" Mar 18 16:43:30.522598 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-cni-dir\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.522598 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-var-lib-kubelet\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-etc-openvswitch\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-log-socket\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bc34432-5dcb-459c-a6d8-587f77ae9dcf-tmp-dir\") pod \"node-resolver-9w7wj\" (UID: \"0bc34432-5dcb-459c-a6d8-587f77ae9dcf\") " pod="openshift-dns/node-resolver-9w7wj" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-os-release\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22ctb\" (UniqueName: 
\"kubernetes.io/projected/fe7890a0-afa4-444d-9ad4-445e8890337a-kube-api-access-22ctb\") pod \"iptables-alerter-hqszc\" (UID: \"fe7890a0-afa4-444d-9ad4-445e8890337a\") " pod="openshift-network-operator/iptables-alerter-hqszc" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-var-lib-kubelet\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-etc-openvswitch\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-log-socket\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-ovnkube-config\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-os-release\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-sys\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-cni-dir\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad6e055b-96d2-47b3-bb6b-f2e0165f3470-serviceca\") pod \"node-ca-sr4nn\" (UID: \"ad6e055b-96d2-47b3-bb6b-f2e0165f3470\") " pod="openshift-image-registry/node-ca-sr4nn" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-sys\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bf84e45b-5912-4a57-b350-dd211f3513fe-konnectivity-ca\") pod 
\"konnectivity-agent-vclgl\" (UID: \"bf84e45b-5912-4a57-b350-dd211f3513fe\") " pod="kube-system/konnectivity-agent-vclgl" Mar 18 16:43:30.522853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-modprobe-d\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-var-lib-kubelet\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcsf6\" (UniqueName: \"kubernetes.io/projected/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-kube-api-access-wcsf6\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.522929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-run-openvswitch\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-run-openvswitch\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-var-lib-kubelet\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-run-ovn-kubernetes\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-cni-netd\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-run-ovn-kubernetes\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-modprobe-d\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98ghs\" (UniqueName: \"kubernetes.io/projected/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-kube-api-access-98ghs\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0bc34432-5dcb-459c-a6d8-587f77ae9dcf-tmp-dir\") pod \"node-resolver-9w7wj\" (UID: \"0bc34432-5dcb-459c-a6d8-587f77ae9dcf\") " pod="openshift-dns/node-resolver-9w7wj" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-cni-netd\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-socket-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-run-systemd\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-kubernetes\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-cni-binary-copy\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.523576 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad6e055b-96d2-47b3-bb6b-f2e0165f3470-serviceca\") pod \"node-ca-sr4nn\" (UID: \"ad6e055b-96d2-47b3-bb6b-f2e0165f3470\") " pod="openshift-image-registry/node-ca-sr4nn" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523294 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-ovnkube-config\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0bc34432-5dcb-459c-a6d8-587f77ae9dcf-hosts-file\") pod \"node-resolver-9w7wj\" (UID: \"0bc34432-5dcb-459c-a6d8-587f77ae9dcf\") " pod="openshift-dns/node-resolver-9w7wj" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-run-systemd\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-socket-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-run-ovn\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0bc34432-5dcb-459c-a6d8-587f77ae9dcf-hosts-file\") pod \"node-resolver-9w7wj\" (UID: \"0bc34432-5dcb-459c-a6d8-587f77ae9dcf\") " pod="openshift-dns/node-resolver-9w7wj" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-systemd\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-run-ovn\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-os-release\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzzmt\" (UniqueName: \"kubernetes.io/projected/e3720b07-5dd9-403e-9a66-17abd67f145f-kube-api-access-tzzmt\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-systemd\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523425 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" 
(UniqueName: \"kubernetes.io/configmap/bf84e45b-5912-4a57-b350-dd211f3513fe-konnectivity-ca\") pod \"konnectivity-agent-vclgl\" (UID: \"bf84e45b-5912-4a57-b350-dd211f3513fe\") " pod="kube-system/konnectivity-agent-vclgl" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-os-release\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-env-overrides\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqhzg\" (UniqueName: \"kubernetes.io/projected/ad6e055b-96d2-47b3-bb6b-f2e0165f3470-kube-api-access-zqhzg\") pod \"node-ca-sr4nn\" (UID: \"ad6e055b-96d2-47b3-bb6b-f2e0165f3470\") " pod="openshift-image-registry/node-ca-sr4nn" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-system-cni-dir\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-kubernetes\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.524360 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-lib-modules\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-host\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-system-cni-dir\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523658 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-lib-modules\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-host\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-run-multus-certs\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-slash\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-run-multus-certs\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-sysconfig\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-cni-binary-copy\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-cnibin\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-slash\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-registration-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-env-overrides\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-cnibin\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-sysconfig\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.525196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-var-lib-cni-multus\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523855 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-var-lib-cni-multus\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-registration-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-conf-dir\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z68v5\" (UniqueName: \"kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5\") pod \"network-check-target-j676s\" (UID: \"ce437450-60be-4502-aa99-03a4af6c8e7c\") " pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-sysctl-conf\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523961 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-tuned\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-conf-dir\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.523992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50878b4f-dc7c-45c7-81b8-e6f009741d18-tmp\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-sys-fs\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: 
I0318 16:43:30.524059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-sysctl-conf\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-run-k8s-cni-cncf-io\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fe7890a0-afa4-444d-9ad4-445e8890337a-iptables-alerter-script\") pod \"iptables-alerter-hqszc\" (UID: \"fe7890a0-afa4-444d-9ad4-445e8890337a\") " pod="openshift-network-operator/iptables-alerter-hqszc" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-run-k8s-cni-cncf-io\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-node-log\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.526038 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:43:30.524126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-sys-fs\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.526038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-node-log\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8gsz\" (UniqueName: \"kubernetes.io/projected/0bc34432-5dcb-459c-a6d8-587f77ae9dcf-kube-api-access-j8gsz\") pod \"node-resolver-9w7wj\" (UID: \"0bc34432-5dcb-459c-a6d8-587f77ae9dcf\") " pod="openshift-dns/node-resolver-9w7wj" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524251 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-kubelet\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-kubelet\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-ovn-node-metrics-cert\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-sysctl-d\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-var-lib-openvswitch\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524514 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-run\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb5zq\" (UniqueName: \"kubernetes.io/projected/50878b4f-dc7c-45c7-81b8-e6f009741d18-kube-api-access-vb5zq\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:30.524576 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:30.526842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-device-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-sysctl-d\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50878b4f-dc7c-45c7-81b8-e6f009741d18-run\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad6e055b-96d2-47b3-bb6b-f2e0165f3470-host\") pod \"node-ca-sr4nn\" (UID: \"ad6e055b-96d2-47b3-bb6b-f2e0165f3470\") " pod="openshift-image-registry/node-ca-sr4nn" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fe7890a0-afa4-444d-9ad4-445e8890337a-iptables-alerter-script\") pod \"iptables-alerter-hqszc\" (UID: \"fe7890a0-afa4-444d-9ad4-445e8890337a\") " pod="openshift-network-operator/iptables-alerter-hqszc" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524645 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-var-lib-openvswitch\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524672 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-device-dir\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:30.524681 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs podName:e3720b07-5dd9-403e-9a66-17abd67f145f nodeName:}" failed. No retries permitted until 2026-03-18 16:43:31.024623337 +0000 UTC m=+2.074438749 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs") pod "network-metrics-daemon-bd2jt" (UID: "e3720b07-5dd9-403e-9a66-17abd67f145f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/832b4122-2cb3-47a3-9cdb-f45d80349575-cni-binary-copy\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-hostroot\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6qh2\" (UniqueName: \"kubernetes.io/projected/832b4122-2cb3-47a3-9cdb-f45d80349575-kube-api-access-d6qh2\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-etc-kubernetes\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/ad6e055b-96d2-47b3-bb6b-f2e0165f3470-host\") pod \"node-ca-sr4nn\" (UID: \"ad6e055b-96d2-47b3-bb6b-f2e0165f3470\") " pod="openshift-image-registry/node-ca-sr4nn" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-cni-bin\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-cni-bin\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-hostroot\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-etc-kubernetes\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.527597 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.524892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bf84e45b-5912-4a57-b350-dd211f3513fe-agent-certs\") pod 
\"konnectivity-agent-vclgl\" (UID: \"bf84e45b-5912-4a57-b350-dd211f3513fe\") " pod="kube-system/konnectivity-agent-vclgl" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-run-netns\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-socket-dir-parent\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-var-lib-cni-bin\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe7890a0-afa4-444d-9ad4-445e8890337a-host-slash\") pod \"iptables-alerter-hqszc\" (UID: \"fe7890a0-afa4-444d-9ad4-445e8890337a\") " pod="openshift-network-operator/iptables-alerter-hqszc" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-systemd-units\") pod 
\"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-cnibin\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-daemon-config\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-run-netns\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-ovnkube-script-lib\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-system-cni-dir\") pod 
\"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/832b4122-2cb3-47a3-9cdb-f45d80349575-cni-binary-copy\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-etc-selinux\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9hs8\" (UniqueName: \"kubernetes.io/projected/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-kube-api-access-q9hs8\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-etc-selinux\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-run-netns\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-socket-dir-parent\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-host-var-lib-cni-bin\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-host-run-netns\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.525749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-systemd-units\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.526002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-system-cni-dir\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.526040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe7890a0-afa4-444d-9ad4-445e8890337a-host-slash\") pod \"iptables-alerter-hqszc\" (UID: \"fe7890a0-afa4-444d-9ad4-445e8890337a\") " pod="openshift-network-operator/iptables-alerter-hqszc" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.526094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/832b4122-2cb3-47a3-9cdb-f45d80349575-cnibin\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.526140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/832b4122-2cb3-47a3-9cdb-f45d80349575-multus-daemon-config\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.526197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-ovnkube-script-lib\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.527599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50878b4f-dc7c-45c7-81b8-e6f009741d18-tmp\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.527653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/50878b4f-dc7c-45c7-81b8-e6f009741d18-etc-tuned\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.527824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bf84e45b-5912-4a57-b350-dd211f3513fe-agent-certs\") pod \"konnectivity-agent-vclgl\" (UID: \"bf84e45b-5912-4a57-b350-dd211f3513fe\") " pod="kube-system/konnectivity-agent-vclgl" Mar 18 16:43:30.528567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.527918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-ovn-node-metrics-cert\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.536785 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:30.536764 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Mar 18 16:43:30.536785 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:30.536789 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:43:30.537728 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:30.536803 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z68v5 for pod openshift-network-diagnostics/network-check-target-j676s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:30.537728 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:30.536894 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5 podName:ce437450-60be-4502-aa99-03a4af6c8e7c nodeName:}" failed. No retries permitted until 2026-03-18 16:43:31.036873707 +0000 UTC m=+2.086689125 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z68v5" (UniqueName: "kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5") pod "network-check-target-j676s" (UID: "ce437450-60be-4502-aa99-03a4af6c8e7c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:30.539841 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.538730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8gsz\" (UniqueName: \"kubernetes.io/projected/0bc34432-5dcb-459c-a6d8-587f77ae9dcf-kube-api-access-j8gsz\") pod \"node-resolver-9w7wj\" (UID: \"0bc34432-5dcb-459c-a6d8-587f77ae9dcf\") " pod="openshift-dns/node-resolver-9w7wj" Mar 18 16:43:30.539841 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.539489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ctb\" (UniqueName: \"kubernetes.io/projected/fe7890a0-afa4-444d-9ad4-445e8890337a-kube-api-access-22ctb\") pod \"iptables-alerter-hqszc\" (UID: \"fe7890a0-afa4-444d-9ad4-445e8890337a\") " pod="openshift-network-operator/iptables-alerter-hqszc" Mar 18 16:43:30.539841 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.539641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9hs8\" (UniqueName: \"kubernetes.io/projected/c1ae1c50-62e9-4fb1-9fc6-458c124ef493-kube-api-access-q9hs8\") pod \"aws-ebs-csi-driver-node-lm2bh\" (UID: \"c1ae1c50-62e9-4fb1-9fc6-458c124ef493\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.539841 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.539655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzzmt\" (UniqueName: \"kubernetes.io/projected/e3720b07-5dd9-403e-9a66-17abd67f145f-kube-api-access-tzzmt\") pod \"network-metrics-daemon-bd2jt\" (UID: 
\"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:43:30.539841 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.539704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6qh2\" (UniqueName: \"kubernetes.io/projected/832b4122-2cb3-47a3-9cdb-f45d80349575-kube-api-access-d6qh2\") pod \"multus-wqbkg\" (UID: \"832b4122-2cb3-47a3-9cdb-f45d80349575\") " pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.539841 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.539788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98ghs\" (UniqueName: \"kubernetes.io/projected/df16c6f7-55c8-4f26-80fc-ae0ba3fa368a-kube-api-access-98ghs\") pod \"ovnkube-node-g2jdr\" (UID: \"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a\") " pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.540742 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.540714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcsf6\" (UniqueName: \"kubernetes.io/projected/fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1-kube-api-access-wcsf6\") pod \"multus-additional-cni-plugins-x7hss\" (UID: \"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1\") " pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.540828 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.540751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb5zq\" (UniqueName: \"kubernetes.io/projected/50878b4f-dc7c-45c7-81b8-e6f009741d18-kube-api-access-vb5zq\") pod \"tuned-bt66t\" (UID: \"50878b4f-dc7c-45c7-81b8-e6f009741d18\") " pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.541001 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.540980 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqhzg\" (UniqueName: 
\"kubernetes.io/projected/ad6e055b-96d2-47b3-bb6b-f2e0165f3470-kube-api-access-zqhzg\") pod \"node-ca-sr4nn\" (UID: \"ad6e055b-96d2-47b3-bb6b-f2e0165f3470\") " pod="openshift-image-registry/node-ca-sr4nn" Mar 18 16:43:30.592560 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.592521 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e78a67ecfa4cc7c383866ea4270c748.slice/crio-bb96e52e65b03b77f66b7e232718691fa1e49ac9f7c6a98ff2aa06132e1acbc3 WatchSource:0}: Error finding container bb96e52e65b03b77f66b7e232718691fa1e49ac9f7c6a98ff2aa06132e1acbc3: Status 404 returned error can't find the container with id bb96e52e65b03b77f66b7e232718691fa1e49ac9f7c6a98ff2aa06132e1acbc3 Mar 18 16:43:30.592906 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.592884 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fbacc3938de459504eb6ed80d05d8dc.slice/crio-039665fb614b505c0353abd5ec011fe7a00ef62333c46375c6fb4714ccbb1219 WatchSource:0}: Error finding container 039665fb614b505c0353abd5ec011fe7a00ef62333c46375c6fb4714ccbb1219: Status 404 returned error can't find the container with id 039665fb614b505c0353abd5ec011fe7a00ef62333c46375c6fb4714ccbb1219 Mar 18 16:43:30.596954 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.596923 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:43:30.721593 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.721558 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sr4nn" Mar 18 16:43:30.728324 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.728295 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad6e055b_96d2_47b3_bb6b_f2e0165f3470.slice/crio-0d4e801f3e058df01b2d3782155206bf5281d3860a83966087004b878bb4bc89 WatchSource:0}: Error finding container 0d4e801f3e058df01b2d3782155206bf5281d3860a83966087004b878bb4bc89: Status 404 returned error can't find the container with id 0d4e801f3e058df01b2d3782155206bf5281d3860a83966087004b878bb4bc89 Mar 18 16:43:30.742222 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.742201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x7hss" Mar 18 16:43:30.748703 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.748681 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc696d94_dd9b_4ec4_8c68_3ef333b3aaa1.slice/crio-aaa43e25dca9a4b31709c5c365c6d4c9aa35d659b89a91a703791514cddea260 WatchSource:0}: Error finding container aaa43e25dca9a4b31709c5c365c6d4c9aa35d659b89a91a703791514cddea260: Status 404 returned error can't find the container with id aaa43e25dca9a4b31709c5c365c6d4c9aa35d659b89a91a703791514cddea260 Mar 18 16:43:30.765094 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.765072 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" Mar 18 16:43:30.765203 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.765129 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wqbkg" Mar 18 16:43:30.772467 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.772442 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ae1c50_62e9_4fb1_9fc6_458c124ef493.slice/crio-93a8aa23c696fc75ca33712d20eefec1a97c2ef0f0cf78cb913adade99464c71 WatchSource:0}: Error finding container 93a8aa23c696fc75ca33712d20eefec1a97c2ef0f0cf78cb913adade99464c71: Status 404 returned error can't find the container with id 93a8aa23c696fc75ca33712d20eefec1a97c2ef0f0cf78cb913adade99464c71 Mar 18 16:43:30.774352 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.774324 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod832b4122_2cb3_47a3_9cdb_f45d80349575.slice/crio-6975eba1cc96368c38ada3e58f7a98f7bc42ea41e216ea4c37a67cbdfe3f1450 WatchSource:0}: Error finding container 6975eba1cc96368c38ada3e58f7a98f7bc42ea41e216ea4c37a67cbdfe3f1450: Status 404 returned error can't find the container with id 6975eba1cc96368c38ada3e58f7a98f7bc42ea41e216ea4c37a67cbdfe3f1450 Mar 18 16:43:30.777091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.777014 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hqszc" Mar 18 16:43:30.782341 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.782318 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe7890a0_afa4_444d_9ad4_445e8890337a.slice/crio-83641dc108a99de76b6a39118062de3e4670bd211f54bf787381a43b4ce4e931 WatchSource:0}: Error finding container 83641dc108a99de76b6a39118062de3e4670bd211f54bf787381a43b4ce4e931: Status 404 returned error can't find the container with id 83641dc108a99de76b6a39118062de3e4670bd211f54bf787381a43b4ce4e931 Mar 18 16:43:30.791924 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.791899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:30.797361 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.797332 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf16c6f7_55c8_4f26_80fc_ae0ba3fa368a.slice/crio-87b486ae423cfb77d9a5ff4733d1f10b9d33ad07e2090ca8f20acbc3c82bac4d WatchSource:0}: Error finding container 87b486ae423cfb77d9a5ff4733d1f10b9d33ad07e2090ca8f20acbc3c82bac4d: Status 404 returned error can't find the container with id 87b486ae423cfb77d9a5ff4733d1f10b9d33ad07e2090ca8f20acbc3c82bac4d Mar 18 16:43:30.809828 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.809808 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-vclgl" Mar 18 16:43:30.815490 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.815464 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf84e45b_5912_4a57_b350_dd211f3513fe.slice/crio-8c8df24a515a23b27a604d114a5d4b968a86ba4f77473c08dc20b7842a9288aa WatchSource:0}: Error finding container 8c8df24a515a23b27a604d114a5d4b968a86ba4f77473c08dc20b7842a9288aa: Status 404 returned error can't find the container with id 8c8df24a515a23b27a604d114a5d4b968a86ba4f77473c08dc20b7842a9288aa Mar 18 16:43:30.816465 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.816421 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bt66t" Mar 18 16:43:30.822004 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.821982 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50878b4f_dc7c_45c7_81b8_e6f009741d18.slice/crio-b2dc4a3a186abb32eaa61dd8d752abe48c668a50ac2d22d1db3e823328095ab9 WatchSource:0}: Error finding container b2dc4a3a186abb32eaa61dd8d752abe48c668a50ac2d22d1db3e823328095ab9: Status 404 returned error can't find the container with id b2dc4a3a186abb32eaa61dd8d752abe48c668a50ac2d22d1db3e823328095ab9 Mar 18 16:43:30.836293 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:30.836218 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9w7wj" Mar 18 16:43:30.843218 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:43:30.843194 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc34432_5dcb_459c_a6d8_587f77ae9dcf.slice/crio-f635a14123b37d3bb0a5c4672b91a2a91568deb52e903d189a11c7fc7feb3394 WatchSource:0}: Error finding container f635a14123b37d3bb0a5c4672b91a2a91568deb52e903d189a11c7fc7feb3394: Status 404 returned error can't find the container with id f635a14123b37d3bb0a5c4672b91a2a91568deb52e903d189a11c7fc7feb3394 Mar 18 16:43:31.028431 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.028397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:43:31.028604 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:31.028533 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:31.028663 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:31.028604 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs podName:e3720b07-5dd9-403e-9a66-17abd67f145f nodeName:}" failed. No retries permitted until 2026-03-18 16:43:32.028584864 +0000 UTC m=+3.078400272 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs") pod "network-metrics-daemon-bd2jt" (UID: "e3720b07-5dd9-403e-9a66-17abd67f145f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:31.129585 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.128735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z68v5\" (UniqueName: \"kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5\") pod \"network-check-target-j676s\" (UID: \"ce437450-60be-4502-aa99-03a4af6c8e7c\") " pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:43:31.129585 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:31.128918 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:43:31.129585 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:31.128955 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:43:31.129585 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:31.128968 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z68v5 for pod openshift-network-diagnostics/network-check-target-j676s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:31.129585 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:31.129026 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5 podName:ce437450-60be-4502-aa99-03a4af6c8e7c nodeName:}" failed. 
No retries permitted until 2026-03-18 16:43:32.129005579 +0000 UTC m=+3.178821005 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-z68v5" (UniqueName: "kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5") pod "network-check-target-j676s" (UID: "ce437450-60be-4502-aa99-03a4af6c8e7c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:31.328769 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.328732 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:31.373290 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.373039 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:31.451268 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.451125 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:38:30 +0000 UTC" deadline="2027-11-30 17:19:49.38371846 +0000 UTC" Mar 18 16:43:31.451268 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.451158 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14928h36m17.932564522s" Mar 18 16:43:31.579070 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.578987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" event={"ID":"1fbacc3938de459504eb6ed80d05d8dc","Type":"ContainerStarted","Data":"039665fb614b505c0353abd5ec011fe7a00ef62333c46375c6fb4714ccbb1219"} Mar 18 16:43:31.592312 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.592278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" event={"ID":"7e78a67ecfa4cc7c383866ea4270c748","Type":"ContainerStarted","Data":"bb96e52e65b03b77f66b7e232718691fa1e49ac9f7c6a98ff2aa06132e1acbc3"} Mar 18 16:43:31.596654 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.596578 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9w7wj" event={"ID":"0bc34432-5dcb-459c-a6d8-587f77ae9dcf","Type":"ContainerStarted","Data":"f635a14123b37d3bb0a5c4672b91a2a91568deb52e903d189a11c7fc7feb3394"} Mar 18 16:43:31.605173 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.605143 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7hss" event={"ID":"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1","Type":"ContainerStarted","Data":"aaa43e25dca9a4b31709c5c365c6d4c9aa35d659b89a91a703791514cddea260"} Mar 18 16:43:31.618109 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.618075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sr4nn" event={"ID":"ad6e055b-96d2-47b3-bb6b-f2e0165f3470","Type":"ContainerStarted","Data":"0d4e801f3e058df01b2d3782155206bf5281d3860a83966087004b878bb4bc89"} Mar 18 16:43:31.627272 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.627227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bt66t" event={"ID":"50878b4f-dc7c-45c7-81b8-e6f009741d18","Type":"ContainerStarted","Data":"b2dc4a3a186abb32eaa61dd8d752abe48c668a50ac2d22d1db3e823328095ab9"} Mar 18 16:43:31.645657 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.645604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vclgl" event={"ID":"bf84e45b-5912-4a57-b350-dd211f3513fe","Type":"ContainerStarted","Data":"8c8df24a515a23b27a604d114a5d4b968a86ba4f77473c08dc20b7842a9288aa"} Mar 18 16:43:31.662227 ip-10-0-133-190 kubenswrapper[2576]: I0318 
16:43:31.662191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" event={"ID":"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a","Type":"ContainerStarted","Data":"87b486ae423cfb77d9a5ff4733d1f10b9d33ad07e2090ca8f20acbc3c82bac4d"} Mar 18 16:43:31.682039 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.682001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hqszc" event={"ID":"fe7890a0-afa4-444d-9ad4-445e8890337a","Type":"ContainerStarted","Data":"83641dc108a99de76b6a39118062de3e4670bd211f54bf787381a43b4ce4e931"} Mar 18 16:43:31.694150 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.693612 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqbkg" event={"ID":"832b4122-2cb3-47a3-9cdb-f45d80349575","Type":"ContainerStarted","Data":"6975eba1cc96368c38ada3e58f7a98f7bc42ea41e216ea4c37a67cbdfe3f1450"} Mar 18 16:43:31.699800 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.699764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" event={"ID":"c1ae1c50-62e9-4fb1-9fc6-458c124ef493","Type":"ContainerStarted","Data":"93a8aa23c696fc75ca33712d20eefec1a97c2ef0f0cf78cb913adade99464c71"} Mar 18 16:43:31.874268 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:31.874237 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:32.036630 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:32.036581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:43:32.036820 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:32.036744 2576 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:32.036820 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:32.036809 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs podName:e3720b07-5dd9-403e-9a66-17abd67f145f nodeName:}" failed. No retries permitted until 2026-03-18 16:43:34.036789179 +0000 UTC m=+5.086604585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs") pod "network-metrics-daemon-bd2jt" (UID: "e3720b07-5dd9-403e-9a66-17abd67f145f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:32.137681 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:32.137647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z68v5\" (UniqueName: \"kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5\") pod \"network-check-target-j676s\" (UID: \"ce437450-60be-4502-aa99-03a4af6c8e7c\") " pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:43:32.137863 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:32.137828 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:43:32.137863 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:32.137853 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:43:32.137863 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:32.137866 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z68v5 for pod 
openshift-network-diagnostics/network-check-target-j676s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:32.138047 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:32.137929 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5 podName:ce437450-60be-4502-aa99-03a4af6c8e7c nodeName:}" failed. No retries permitted until 2026-03-18 16:43:34.137910303 +0000 UTC m=+5.187725708 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-z68v5" (UniqueName: "kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5") pod "network-check-target-j676s" (UID: "ce437450-60be-4502-aa99-03a4af6c8e7c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:32.451998 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:32.451741 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:38:30 +0000 UTC" deadline="2027-10-01 11:43:31.342756997 +0000 UTC" Mar 18 16:43:32.451998 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:32.451852 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13482h59m58.890910202s" Mar 18 16:43:32.561234 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:32.561197 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:32.561432 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:32.561339 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:32.561795 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:32.561774 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:32.561888 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:32.561869 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:34.057213 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:34.056478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:34.057213 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:34.056614 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:34.057213 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:34.056683 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs podName:e3720b07-5dd9-403e-9a66-17abd67f145f nodeName:}" failed. No retries permitted until 2026-03-18 16:43:38.056659845 +0000 UTC m=+9.106475299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs") pod "network-metrics-daemon-bd2jt" (UID: "e3720b07-5dd9-403e-9a66-17abd67f145f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:34.157861 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:34.157705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z68v5\" (UniqueName: \"kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5\") pod \"network-check-target-j676s\" (UID: \"ce437450-60be-4502-aa99-03a4af6c8e7c\") " pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:34.158150 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:34.157906 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:34.158150 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:34.157922 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:34.158150 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:34.157931 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z68v5 for pod openshift-network-diagnostics/network-check-target-j676s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:34.158150 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:34.158008 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5 podName:ce437450-60be-4502-aa99-03a4af6c8e7c nodeName:}" failed. No retries permitted until 2026-03-18 16:43:38.15798027 +0000 UTC m=+9.207795675 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-z68v5" (UniqueName: "kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5") pod "network-check-target-j676s" (UID: "ce437450-60be-4502-aa99-03a4af6c8e7c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:34.559472 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:34.558832 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:34.559472 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:34.558996 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:34.559472 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:34.559159 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:34.559472 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:34.559265 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:36.559252 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:36.559217 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:36.559734 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:36.559227 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:36.559734 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:36.559426 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:36.559734 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:36.559559 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:38.091180 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:38.091141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:38.091622 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:38.091289 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:38.091622 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:38.091355 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs podName:e3720b07-5dd9-403e-9a66-17abd67f145f nodeName:}" failed. No retries permitted until 2026-03-18 16:43:46.091338068 +0000 UTC m=+17.141153473 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs") pod "network-metrics-daemon-bd2jt" (UID: "e3720b07-5dd9-403e-9a66-17abd67f145f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:38.192713 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:38.192055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z68v5\" (UniqueName: \"kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5\") pod \"network-check-target-j676s\" (UID: \"ce437450-60be-4502-aa99-03a4af6c8e7c\") " pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:38.192713 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:38.192268 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:38.192713 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:38.192290 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:38.192713 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:38.192304 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z68v5 for pod openshift-network-diagnostics/network-check-target-j676s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:38.192713 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:38.192364 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5 podName:ce437450-60be-4502-aa99-03a4af6c8e7c nodeName:}" failed. No retries permitted until 2026-03-18 16:43:46.192344562 +0000 UTC m=+17.242159970 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-z68v5" (UniqueName: "kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5") pod "network-check-target-j676s" (UID: "ce437450-60be-4502-aa99-03a4af6c8e7c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:38.558341 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:38.558302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:38.558341 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:38.558302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:38.558589 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:38.558470 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:38.558645 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:38.558600 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:40.558668 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:40.558628 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:40.559117 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:40.558636 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:40.559117 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:40.558765 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:40.559117 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:40.558843 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:42.558611 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:42.558574 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:42.559087 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:42.558575 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:42.559087 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:42.558710 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:42.559087 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:42.558788 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:44.558621 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:44.558583 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:44.559061 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:44.558583 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:44.559061 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:44.558708 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:44.559061 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:44.558811 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:46.147699 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:46.147662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:46.148253 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:46.147802 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:46.148253 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:46.147870 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs podName:e3720b07-5dd9-403e-9a66-17abd67f145f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:02.147849065 +0000 UTC m=+33.197664481 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs") pod "network-metrics-daemon-bd2jt" (UID: "e3720b07-5dd9-403e-9a66-17abd67f145f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:46.248715 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:46.248674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z68v5\" (UniqueName: \"kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5\") pod \"network-check-target-j676s\" (UID: \"ce437450-60be-4502-aa99-03a4af6c8e7c\") " pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:46.248885 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:46.248835 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:46.248885 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:46.248859 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:46.248885 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:46.248875 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z68v5 for pod openshift-network-diagnostics/network-check-target-j676s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:46.249040 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:46.248952 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5 podName:ce437450-60be-4502-aa99-03a4af6c8e7c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:02.248915579 +0000 UTC m=+33.298730990 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-z68v5" (UniqueName: "kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5") pod "network-check-target-j676s" (UID: "ce437450-60be-4502-aa99-03a4af6c8e7c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:46.558771 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:46.558727 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:46.558970 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:46.558727 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:46.558970 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:46.558848 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:46.558970 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:46.558911 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:48.559195 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:48.559156 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:48.559566 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:48.559261 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:48.559566 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:48.559315 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:48.559566 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:48.559410 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:49.737659 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.737301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bt66t" event={"ID":"50878b4f-dc7c-45c7-81b8-e6f009741d18","Type":"ContainerStarted","Data":"b66697ee102d9eaec7b0bb8970690f9db475900a85e5d54872732481fbb7e248"}
Mar 18 16:43:49.752363 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.752338 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log"
Mar 18 16:43:49.752674 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.752654 2576 generic.go:358] "Generic (PLEG): container finished" podID="df16c6f7-55c8-4f26-80fc-ae0ba3fa368a" containerID="c1a90c4ca43471e1bc1b7908416f6c5800bba675960da55e7ab233bbb33eabd7" exitCode=1
Mar 18 16:43:49.752742 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.752720 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" event={"ID":"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a","Type":"ContainerStarted","Data":"2bedd5d3959180da5f6b05735791d4d14e664472a0975a3cbb2758cc1c68b20b"}
Mar 18 16:43:49.752793 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.752751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" event={"ID":"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a","Type":"ContainerStarted","Data":"8f85f695b89e3f12cfdb3bc0acaad481bb16f3ce0af08568ba0749dcdaac87a0"}
Mar 18 16:43:49.752793 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.752763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" event={"ID":"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a","Type":"ContainerStarted","Data":"5da13c322efd279fe192ed4d0e1d27211926fabccfd7e6d427f10c0157515c5d"}
Mar 18 16:43:49.752793 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.752779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" event={"ID":"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a","Type":"ContainerStarted","Data":"de6ea9741834efb59de05b58583583219a2eab6efe406049959e77ba3d8ec49e"}
Mar 18 16:43:49.752793 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.752792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" event={"ID":"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a","Type":"ContainerDied","Data":"c1a90c4ca43471e1bc1b7908416f6c5800bba675960da55e7ab233bbb33eabd7"}
Mar 18 16:43:49.753010 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.752807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" event={"ID":"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a","Type":"ContainerStarted","Data":"6e7b2b3f08e547c481f865f66d499d0a2f331dd5e44a092257a5fa47201ab973"}
Mar 18 16:43:49.754233 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.754192 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bt66t" podStartSLOduration=2.710475007 podStartE2EDuration="20.754177269s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:43:30.823466425 +0000 UTC m=+1.873281830" lastFinishedPulling="2026-03-18 16:43:48.867168684 +0000 UTC m=+19.916984092" observedRunningTime="2026-03-18 16:43:49.753506334 +0000 UTC m=+20.803321784" watchObservedRunningTime="2026-03-18 16:43:49.754177269 +0000 UTC m=+20.803992697"
Mar 18 16:43:49.754518 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.754365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wqbkg" event={"ID":"832b4122-2cb3-47a3-9cdb-f45d80349575","Type":"ContainerStarted","Data":"eeb177a061c91f1d36a1d9ee539ebcda0cee535d62c9fec641341e3f749c86a0"}
Mar 18 16:43:49.755991 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.755974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" event={"ID":"1fbacc3938de459504eb6ed80d05d8dc","Type":"ContainerStarted","Data":"fa592f1e29a27c26204d3c8b290d05694d8555d1ebede8c2067295a8b6d2e3e1"}
Mar 18 16:43:49.777632 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.777582 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wqbkg" podStartSLOduration=2.341879621 podStartE2EDuration="20.777567059s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:43:30.775761997 +0000 UTC m=+1.825577402" lastFinishedPulling="2026-03-18 16:43:49.211449435 +0000 UTC m=+20.261264840" observedRunningTime="2026-03-18 16:43:49.777464133 +0000 UTC m=+20.827279556" watchObservedRunningTime="2026-03-18 16:43:49.777567059 +0000 UTC m=+20.827382485"
Mar 18 16:43:49.789899 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:49.789854 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-190.ec2.internal" podStartSLOduration=20.789840651 podStartE2EDuration="20.789840651s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:43:49.789347932 +0000 UTC m=+20.839163361" watchObservedRunningTime="2026-03-18 16:43:49.789840651 +0000 UTC m=+20.839656078"
Mar 18 16:43:50.559063 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.558765 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt"
Mar 18 16:43:50.559063 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.558796 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s"
Mar 18 16:43:50.559274 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:50.559080 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f"
Mar 18 16:43:50.559274 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:50.559121 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c"
Mar 18 16:43:50.752233 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.752211 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Mar 18 16:43:50.758664 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.758639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vclgl" event={"ID":"bf84e45b-5912-4a57-b350-dd211f3513fe","Type":"ContainerStarted","Data":"f4e0b4074def258ab5b789ed1bc9aa4a9165e5a2432c56627eb4ecede2b566a9"}
Mar 18 16:43:50.759958 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.759917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hqszc" event={"ID":"fe7890a0-afa4-444d-9ad4-445e8890337a","Type":"ContainerStarted","Data":"dba51c22551126b808e8da19dfaf669d3577ef50e56253b00f10349ea8faa1d6"}
Mar 18 16:43:50.761492 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.761470 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" event={"ID":"c1ae1c50-62e9-4fb1-9fc6-458c124ef493","Type":"ContainerStarted","Data":"895ba0f077cbdd016dafe982aca0b35a836759272cdaf92cc4592c39de6dc26a"}
Mar 18 16:43:50.761572 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.761497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" event={"ID":"c1ae1c50-62e9-4fb1-9fc6-458c124ef493","Type":"ContainerStarted","Data":"513f5c1c446d276f7850cb3830997fb35183ea447f910b003aaa2d40b018d85f"}
Mar 18 16:43:50.762719 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.762696 2576 generic.go:358] "Generic (PLEG): container finished" podID="7e78a67ecfa4cc7c383866ea4270c748" containerID="6da890d69479fa68de16e6aa1c9368dc612ba204a5e707529e78a3414c197b9f" exitCode=0
Mar 18 16:43:50.762797 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.762774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" event={"ID":"7e78a67ecfa4cc7c383866ea4270c748","Type":"ContainerDied","Data":"6da890d69479fa68de16e6aa1c9368dc612ba204a5e707529e78a3414c197b9f"}
Mar 18 16:43:50.763995 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.763975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9w7wj" event={"ID":"0bc34432-5dcb-459c-a6d8-587f77ae9dcf","Type":"ContainerStarted","Data":"ca824af4d545f702a7647ea3d962221d1f0bdbc4b40b05e8bed45a72673d4f15"}
Mar 18 16:43:50.765202 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.765180 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1" containerID="bda2f25314b10c1bf6286ef31ad4fe39da8622e465aeee9a32a092b71e4d8a2d" exitCode=0
Mar 18 16:43:50.765299 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.765248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7hss" event={"ID":"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1","Type":"ContainerDied","Data":"bda2f25314b10c1bf6286ef31ad4fe39da8622e465aeee9a32a092b71e4d8a2d"}
Mar 18 16:43:50.766517 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.766492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sr4nn" event={"ID":"ad6e055b-96d2-47b3-bb6b-f2e0165f3470","Type":"ContainerStarted","Data":"1601dbce40f4857536a06e862d815e989411ea9b050c77e58aaa7b2b2dc8fd35"}
Mar 18 16:43:50.784019 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.783984 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vclgl"
Mar 18 16:43:50.784561 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.784514 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vclgl" podStartSLOduration=11.934289044 podStartE2EDuration="21.784498972s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:43:30.816778444 +0000 UTC m=+1.866593849" lastFinishedPulling="2026-03-18 16:43:40.666988368 +0000 UTC m=+11.716803777" observedRunningTime="2026-03-18 16:43:50.784054624 +0000 UTC m=+21.833870052" watchObservedRunningTime="2026-03-18 16:43:50.784498972 +0000 UTC m=+21.834314401"
Mar 18 16:43:50.784690 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.784674 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vclgl"
Mar 18 16:43:50.837793 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.837327 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9w7wj" podStartSLOduration=3.816759336 podStartE2EDuration="21.837312432s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:43:30.844653919 +0000 UTC m=+1.894469327" lastFinishedPulling="2026-03-18 16:43:48.865207017 +0000 UTC m=+19.915022423" observedRunningTime="2026-03-18 16:43:50.837116815 +0000 UTC m=+21.886932242" watchObservedRunningTime="2026-03-18 16:43:50.837312432 +0000 UTC m=+21.887127859"
Mar 18 16:43:50.837793 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.837694 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hqszc" podStartSLOduration=3.7562806909999997 podStartE2EDuration="21.837688278s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:43:30.783810493 +0000 UTC m=+1.833625903" lastFinishedPulling="2026-03-18 16:43:48.865218071 +0000 UTC m=+19.915033490" observedRunningTime="2026-03-18 16:43:50.823369534 +0000 UTC m=+21.873184975" watchObservedRunningTime="2026-03-18 16:43:50.837688278 +0000 UTC m=+21.887504092"
Mar 18 16:43:50.871600 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:50.871560 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sr4nn" podStartSLOduration=3.7359722140000002 podStartE2EDuration="21.871545594s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:43:30.729866571 +0000 UTC m=+1.779681976" lastFinishedPulling="2026-03-18 16:43:48.865439937 +0000 UTC m=+19.915255356" observedRunningTime="2026-03-18 16:43:50.871385684 +0000 UTC m=+21.921201112" watchObservedRunningTime="2026-03-18 16:43:50.871545594 +0000 UTC m=+21.921361020"
Mar 18 16:43:51.211874 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.211792 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6qxc8"]
Mar 18 16:43:51.218657 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.218624 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6qxc8"
Mar 18 16:43:51.218847 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:51.218712 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6qxc8" podUID="d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b"
Mar 18 16:43:51.281895 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.281851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8"
Mar 18 16:43:51.281895 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.281894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-dbus\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8"
Mar 18 16:43:51.282227 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.281923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-kubelet-config\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8"
Mar 18 16:43:51.382426 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.382394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started
for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:51.382426 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.382431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-dbus\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:51.382659 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.382451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-kubelet-config\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:51.382659 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.382545 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-kubelet-config\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:51.382659 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:51.382643 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:51.382827 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:51.382686 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret podName:d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b nodeName:}" failed. 
No retries permitted until 2026-03-18 16:43:51.882673 +0000 UTC m=+22.932488404 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret") pod "global-pull-secret-syncer-6qxc8" (UID: "d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:51.383439 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.383095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-dbus\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:51.489844 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.489728 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:43:50.752229289Z","UUID":"1f3b1de4-6373-4cd8-973a-45b909a98b2f","Handler":null,"Name":"","Endpoint":""} Mar 18 16:43:51.491691 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.491668 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 18 16:43:51.491832 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.491701 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 18 16:43:51.770965 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.770852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" 
event={"ID":"c1ae1c50-62e9-4fb1-9fc6-458c124ef493","Type":"ContainerStarted","Data":"d0423d2c617e76cef8298b1c1b4fa6f5c413055715e9b4b5c4de37c2946f3d2f"} Mar 18 16:43:51.772795 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.772767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" event={"ID":"7e78a67ecfa4cc7c383866ea4270c748","Type":"ContainerStarted","Data":"f551df69cd85463fe5521bd5fa3fb38b161d80e3c1d19967ec2d64e2869df18c"} Mar 18 16:43:51.775823 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.775797 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 16:43:51.776195 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.776174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" event={"ID":"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a","Type":"ContainerStarted","Data":"3a540d4f15c51f32b80dd5d740defc957f3598eb48530de94c2240bec360ce60"} Mar 18 16:43:51.776796 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.776774 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vclgl" Mar 18 16:43:51.777471 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.777452 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vclgl" Mar 18 16:43:51.788893 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.788857 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lm2bh" podStartSLOduration=1.980083831 podStartE2EDuration="22.788845771s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:43:30.775708712 +0000 UTC m=+1.825524116" lastFinishedPulling="2026-03-18 16:43:51.584470636 +0000 UTC 
m=+22.634286056" observedRunningTime="2026-03-18 16:43:51.788562626 +0000 UTC m=+22.838378143" watchObservedRunningTime="2026-03-18 16:43:51.788845771 +0000 UTC m=+22.838661197" Mar 18 16:43:51.815605 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.815551 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-190.ec2.internal" podStartSLOduration=22.815533338 podStartE2EDuration="22.815533338s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:43:51.815367097 +0000 UTC m=+22.865182525" watchObservedRunningTime="2026-03-18 16:43:51.815533338 +0000 UTC m=+22.865348766" Mar 18 16:43:51.887955 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:51.887907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:51.888134 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:51.888058 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:51.888134 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:51.888121 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret podName:d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:43:52.888103959 +0000 UTC m=+23.937919381 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret") pod "global-pull-secret-syncer-6qxc8" (UID: "d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:52.558807 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:52.558776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:43:52.559008 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:52.558776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:43:52.559008 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:52.558905 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c" Mar 18 16:43:52.559148 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:52.559006 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f" Mar 18 16:43:52.894629 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:52.894550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:52.895099 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:52.895029 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:52.895099 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:52.895098 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret podName:d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:43:54.895078104 +0000 UTC m=+25.944893523 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret") pod "global-pull-secret-syncer-6qxc8" (UID: "d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:53.564570 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:53.564542 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:53.564750 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:53.564661 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6qxc8" podUID="d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b" Mar 18 16:43:54.558302 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:54.558268 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:43:54.558771 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:54.558268 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:43:54.558771 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:54.558391 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c" Mar 18 16:43:54.558771 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:54.558490 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f" Mar 18 16:43:54.785033 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:54.784849 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 16:43:54.785356 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:54.785334 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" event={"ID":"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a","Type":"ContainerStarted","Data":"115b3c109e4b52ca12faa6c362d716420d0c2764ee57f26afa0a5e1ec8a018d4"} Mar 18 16:43:54.911119 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:54.911038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:54.911291 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:54.911180 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:54.911291 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:54.911269 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret podName:d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:43:58.911246508 +0000 UTC m=+29.961061927 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret") pod "global-pull-secret-syncer-6qxc8" (UID: "d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:55.559262 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:55.559227 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:55.559983 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:55.559336 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6qxc8" podUID="d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b" Mar 18 16:43:55.788626 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:55.788594 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1" containerID="a3d9eff584975d3eb8ceefccea56c452dbcb8a4306c3ca58dc686524e1401a94" exitCode=0 Mar 18 16:43:55.788772 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:55.788680 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7hss" event={"ID":"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1","Type":"ContainerDied","Data":"a3d9eff584975d3eb8ceefccea56c452dbcb8a4306c3ca58dc686524e1401a94"} Mar 18 16:43:55.789104 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:55.789083 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:55.789172 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:55.789116 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:55.789172 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:55.789128 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:55.789263 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:55.789227 2576 scope.go:117] "RemoveContainer" containerID="c1a90c4ca43471e1bc1b7908416f6c5800bba675960da55e7ab233bbb33eabd7" Mar 18 16:43:55.808867 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:55.808842 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:55.808985 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:55.808972 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:43:56.559288 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.559109 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:43:56.559683 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.559119 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:43:56.559683 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:56.559385 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c" Mar 18 16:43:56.559683 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:56.559438 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f" Mar 18 16:43:56.718040 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.718004 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6qxc8"] Mar 18 16:43:56.718201 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.718144 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:56.718275 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:56.718253 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6qxc8" podUID="d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b" Mar 18 16:43:56.721689 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.721660 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j676s"] Mar 18 16:43:56.722282 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.722261 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bd2jt"] Mar 18 16:43:56.792475 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.792441 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1" containerID="f61e8f713e187f1a95ce1bb37ed9fcab936bae02d62dfe8769c57e98a75af8a2" exitCode=0 Mar 18 16:43:56.792630 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.792519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7hss" event={"ID":"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1","Type":"ContainerDied","Data":"f61e8f713e187f1a95ce1bb37ed9fcab936bae02d62dfe8769c57e98a75af8a2"} Mar 18 16:43:56.795910 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.795893 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 16:43:56.796252 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.796208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" event={"ID":"df16c6f7-55c8-4f26-80fc-ae0ba3fa368a","Type":"ContainerStarted","Data":"64074aaff97823af8df0e6c8e76525b58475ecaa9dd289ffaa9fb4a559c94741"} Mar 18 16:43:56.796308 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.796260 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:43:56.796346 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.796315 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:43:56.796409 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:56.796391 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f" Mar 18 16:43:56.796590 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:56.796567 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c" Mar 18 16:43:56.864565 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:56.864512 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" podStartSLOduration=9.75561795 podStartE2EDuration="27.86449619s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:43:30.798817761 +0000 UTC m=+1.848633167" lastFinishedPulling="2026-03-18 16:43:48.907695986 +0000 UTC m=+19.957511407" observedRunningTime="2026-03-18 16:43:56.864029387 +0000 UTC m=+27.913844813" watchObservedRunningTime="2026-03-18 16:43:56.86449619 +0000 UTC m=+27.914311617" Mar 18 16:43:57.800304 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:57.800266 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1" containerID="9d54a99605af8225ab2ca3eb0122a240d8b6e024bb89fa0d75a95d41ebb4aa97" exitCode=0 Mar 18 16:43:57.800744 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:57.800329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7hss" event={"ID":"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1","Type":"ContainerDied","Data":"9d54a99605af8225ab2ca3eb0122a240d8b6e024bb89fa0d75a95d41ebb4aa97"} Mar 18 16:43:58.559281 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:58.559185 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:58.559281 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:58.559209 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:43:58.559497 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:58.559305 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:43:58.559497 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:58.559322 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6qxc8" podUID="d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b" Mar 18 16:43:58.559497 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:58.559435 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f" Mar 18 16:43:58.559657 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:58.559529 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c" Mar 18 16:43:58.941718 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:43:58.941640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:43:58.942137 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:58.941819 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:43:58.942137 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:43:58.941891 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret podName:d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:06.941871567 +0000 UTC m=+37.991686989 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret") pod "global-pull-secret-syncer-6qxc8" (UID: "d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:00.558517 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:00.558485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:44:00.558983 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:00.558485 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:44:00.558983 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:00.558621 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6qxc8" podUID="d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b" Mar 18 16:44:00.558983 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:00.558718 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bd2jt" podUID="e3720b07-5dd9-403e-9a66-17abd67f145f" Mar 18 16:44:00.558983 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:00.558485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:44:00.558983 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:00.558848 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j676s" podUID="ce437450-60be-4502-aa99-03a4af6c8e7c" Mar 18 16:44:02.170380 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.170113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:44:02.170810 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.170291 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:02.170810 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.170491 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs podName:e3720b07-5dd9-403e-9a66-17abd67f145f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:34.170466267 +0000 UTC m=+65.220281680 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs") pod "network-metrics-daemon-bd2jt" (UID: "e3720b07-5dd9-403e-9a66-17abd67f145f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:02.271519 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.271477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z68v5\" (UniqueName: \"kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5\") pod \"network-check-target-j676s\" (UID: \"ce437450-60be-4502-aa99-03a4af6c8e7c\") " pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:44:02.271702 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.271620 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:02.271702 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.271642 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:02.271702 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.271654 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z68v5 for pod openshift-network-diagnostics/network-check-target-j676s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:02.271862 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.271716 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5 podName:ce437450-60be-4502-aa99-03a4af6c8e7c nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:34.271697273 +0000 UTC m=+65.321512694 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-z68v5" (UniqueName: "kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5") pod "network-check-target-j676s" (UID: "ce437450-60be-4502-aa99-03a4af6c8e7c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:02.294339 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.294308 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-190.ec2.internal" event="NodeReady" Mar 18 16:44:02.294505 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.294469 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Mar 18 16:44:02.328434 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.328389 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-9cbc96769-fp2t2"] Mar 18 16:44:02.354843 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.354798 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9cbc96769-fp2t2"] Mar 18 16:44:02.354843 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.354836 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mv4xk"] Mar 18 16:44:02.355091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.354977 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.357017 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.356909 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 18 16:44:02.357147 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.357019 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Mar 18 16:44:02.357415 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.357329 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-27ljg\"" Mar 18 16:44:02.357415 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.357334 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 18 16:44:02.371392 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.371371 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 18 16:44:02.377508 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.377488 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zlctd"] Mar 18 16:44:02.377646 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.377631 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:02.379345 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.379324 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 18 16:44:02.379466 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.379334 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkmdp\"" Mar 18 16:44:02.379466 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.379383 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 18 16:44:02.379720 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.379706 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 18 16:44:02.395653 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.395616 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mv4xk"] Mar 18 16:44:02.395653 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.395646 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zlctd"] Mar 18 16:44:02.395800 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.395750 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.397601 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.397569 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bwt6\"" Mar 18 16:44:02.397729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.397664 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 18 16:44:02.397909 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.397886 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 18 16:44:02.472814 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.472765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.472814 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.472814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c617a8be-fcd4-4bd3-971f-b335484b9beb-config-volume\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.473089 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.472883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-installation-pull-secrets\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 
16:44:02.473089 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.472932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75dtn\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-kube-api-access-75dtn\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.473089 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.472975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-certificates\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.473089 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.473003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-trusted-ca\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.473089 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.473022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:02.473089 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.473045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-bound-sa-token\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.473335 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.473120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-ca-trust-extracted\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.473335 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.473141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c617a8be-fcd4-4bd3-971f-b335484b9beb-tmp-dir\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.473335 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.473156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvvf2\" (UniqueName: \"kubernetes.io/projected/c617a8be-fcd4-4bd3-971f-b335484b9beb-kube-api-access-rvvf2\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.473335 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.473252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-image-registry-private-configuration\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 
16:44:02.473335 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.473287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.473335 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.473316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmtdc\" (UniqueName: \"kubernetes.io/projected/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-kube-api-access-wmtdc\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:02.558687 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.558649 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:44:02.558891 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.558702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:44:02.558891 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.558649 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:44:02.561412 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.560970 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:44:02.561412 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.560980 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:44:02.561412 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.560996 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:44:02.561412 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.561012 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Mar 18 16:44:02.561412 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.561043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pqk66\"" Mar 18 16:44:02.561412 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.561161 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nk2x6\"" Mar 18 16:44:02.573775 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.573739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-trusted-ca\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.573921 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.573796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:02.573921 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.573831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-bound-sa-token\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.573921 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.573888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-ca-trust-extracted\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.573921 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.573915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c617a8be-fcd4-4bd3-971f-b335484b9beb-tmp-dir\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.574163 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.573954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvvf2\" (UniqueName: \"kubernetes.io/projected/c617a8be-fcd4-4bd3-971f-b335484b9beb-kube-api-access-rvvf2\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.574163 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.573988 2576 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:02.574163 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.574048 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert podName:4220311f-b2d3-43c5-87a6-ddf6edd88e2f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:03.074029307 +0000 UTC m=+34.123844712 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert") pod "ingress-canary-mv4xk" (UID: "4220311f-b2d3-43c5-87a6-ddf6edd88e2f") : secret "canary-serving-cert" not found Mar 18 16:44:02.574163 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.573991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-image-registry-private-configuration\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.574163 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.574094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.574163 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.574126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmtdc\" (UniqueName: \"kubernetes.io/projected/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-kube-api-access-wmtdc\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 
16:44:02.574163 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.574153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.574178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c617a8be-fcd4-4bd3-971f-b335484b9beb-config-volume\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.574219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-installation-pull-secrets\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.574261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75dtn\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-kube-api-access-75dtn\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.574294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-certificates\") pod 
\"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.574297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c617a8be-fcd4-4bd3-971f-b335484b9beb-tmp-dir\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.574309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-ca-trust-extracted\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.574389 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.574402 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9cbc96769-fp2t2: secret "image-registry-tls" not found Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.574795 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls podName:f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca nodeName:}" failed. No retries permitted until 2026-03-18 16:44:03.07477581 +0000 UTC m=+34.124591232 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls") pod "image-registry-9cbc96769-fp2t2" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca") : secret "image-registry-tls" not found Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.574834 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.574841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c617a8be-fcd4-4bd3-971f-b335484b9beb-config-volume\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.574924 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:02.574922 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls podName:c617a8be-fcd4-4bd3-971f-b335484b9beb nodeName:}" failed. No retries permitted until 2026-03-18 16:44:03.074902941 +0000 UTC m=+34.124718362 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls") pod "dns-default-zlctd" (UID: "c617a8be-fcd4-4bd3-971f-b335484b9beb") : secret "dns-default-metrics-tls" not found Mar 18 16:44:02.575696 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.575098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-trusted-ca\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.575696 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.575341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-certificates\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.579165 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.579140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-installation-pull-secrets\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.579266 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.579164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-image-registry-private-configuration\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" 
Mar 18 16:44:02.584813 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.584758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvvf2\" (UniqueName: \"kubernetes.io/projected/c617a8be-fcd4-4bd3-971f-b335484b9beb-kube-api-access-rvvf2\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:02.584813 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.584792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-bound-sa-token\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.585350 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.585323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75dtn\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-kube-api-access-75dtn\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:02.595118 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:02.595091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmtdc\" (UniqueName: \"kubernetes.io/projected/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-kube-api-access-wmtdc\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:03.077705 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:03.077667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls\") pod \"dns-default-zlctd\" (UID: 
\"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:03.077894 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:03.077714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:03.077894 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:03.077785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:03.077894 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:03.077837 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:03.077894 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:03.077862 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:03.077894 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:03.077882 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9cbc96769-fp2t2: secret "image-registry-tls" not found Mar 18 16:44:03.077894 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:03.077887 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:03.078167 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:03.077911 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls 
podName:c617a8be-fcd4-4bd3-971f-b335484b9beb nodeName:}" failed. No retries permitted until 2026-03-18 16:44:04.077891545 +0000 UTC m=+35.127706953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls") pod "dns-default-zlctd" (UID: "c617a8be-fcd4-4bd3-971f-b335484b9beb") : secret "dns-default-metrics-tls" not found Mar 18 16:44:03.078167 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:03.077933 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert podName:4220311f-b2d3-43c5-87a6-ddf6edd88e2f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:04.07792222 +0000 UTC m=+35.127737625 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert") pod "ingress-canary-mv4xk" (UID: "4220311f-b2d3-43c5-87a6-ddf6edd88e2f") : secret "canary-serving-cert" not found Mar 18 16:44:03.078167 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:03.077965 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls podName:f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca nodeName:}" failed. No retries permitted until 2026-03-18 16:44:04.077955275 +0000 UTC m=+35.127770681 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls") pod "image-registry-9cbc96769-fp2t2" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca") : secret "image-registry-tls" not found Mar 18 16:44:04.084346 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:04.084317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:04.084922 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:04.084354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:04.084922 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:04.084406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:04.084922 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:04.084466 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:04.084922 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:04.084521 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:04.084922 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:04.084529 2576 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls podName:c617a8be-fcd4-4bd3-971f-b335484b9beb nodeName:}" failed. No retries permitted until 2026-03-18 16:44:06.084515237 +0000 UTC m=+37.134330643 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls") pod "dns-default-zlctd" (UID: "c617a8be-fcd4-4bd3-971f-b335484b9beb") : secret "dns-default-metrics-tls" not found Mar 18 16:44:04.084922 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:04.084568 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert podName:4220311f-b2d3-43c5-87a6-ddf6edd88e2f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:06.084553808 +0000 UTC m=+37.134369213 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert") pod "ingress-canary-mv4xk" (UID: "4220311f-b2d3-43c5-87a6-ddf6edd88e2f") : secret "canary-serving-cert" not found Mar 18 16:44:04.084922 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:04.084524 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:04.084922 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:04.084589 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9cbc96769-fp2t2: secret "image-registry-tls" not found Mar 18 16:44:04.084922 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:04.084633 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls podName:f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:06.084622716 +0000 UTC m=+37.134438124 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls") pod "image-registry-9cbc96769-fp2t2" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca") : secret "image-registry-tls" not found Mar 18 16:44:04.818108 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:04.818071 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1" containerID="8cae4058126e24d90f40caf35a68409a77c3b31905b71ca47e45eab892e466d5" exitCode=0 Mar 18 16:44:04.818257 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:04.818115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7hss" event={"ID":"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1","Type":"ContainerDied","Data":"8cae4058126e24d90f40caf35a68409a77c3b31905b71ca47e45eab892e466d5"} Mar 18 16:44:05.822029 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:05.821996 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1" containerID="0ff085e94f710d6ac02b73501e271963abcd387e9bcb666a1d53cb2005f3ec91" exitCode=0 Mar 18 16:44:05.822452 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:05.822038 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7hss" event={"ID":"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1","Type":"ContainerDied","Data":"0ff085e94f710d6ac02b73501e271963abcd387e9bcb666a1d53cb2005f3ec91"} Mar 18 16:44:06.102189 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.102101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " 
pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:06.102189 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.102167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:06.102376 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.102214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:06.102376 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:06.102251 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:06.102376 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:06.102269 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9cbc96769-fp2t2: secret "image-registry-tls" not found Mar 18 16:44:06.102376 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:06.102297 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:06.102376 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:06.102320 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls podName:f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca nodeName:}" failed. No retries permitted until 2026-03-18 16:44:10.102304975 +0000 UTC m=+41.152120379 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls") pod "image-registry-9cbc96769-fp2t2" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca") : secret "image-registry-tls" not found Mar 18 16:44:06.102376 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:06.102340 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls podName:c617a8be-fcd4-4bd3-971f-b335484b9beb nodeName:}" failed. No retries permitted until 2026-03-18 16:44:10.10232766 +0000 UTC m=+41.152143065 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls") pod "dns-default-zlctd" (UID: "c617a8be-fcd4-4bd3-971f-b335484b9beb") : secret "dns-default-metrics-tls" not found Mar 18 16:44:06.102376 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:06.102298 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:06.102376 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:06.102366 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert podName:4220311f-b2d3-43c5-87a6-ddf6edd88e2f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:10.102358038 +0000 UTC m=+41.152173444 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert") pod "ingress-canary-mv4xk" (UID: "4220311f-b2d3-43c5-87a6-ddf6edd88e2f") : secret "canary-serving-cert" not found Mar 18 16:44:06.602090 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.601859 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4"] Mar 18 16:44:06.642251 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.642212 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4"] Mar 18 16:44:06.642417 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.642303 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4" Mar 18 16:44:06.644517 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.644494 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Mar 18 16:44:06.644645 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.644492 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:06.644645 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.644565 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-nkf4w\"" Mar 18 16:44:06.806532 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.806503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tckgv\" (UniqueName: \"kubernetes.io/projected/c1a43c8a-8d1f-4c15-804d-994b6386c863-kube-api-access-tckgv\") pod \"migrator-6b589cdcc-hxcp4\" (UID: 
\"c1a43c8a-8d1f-4c15-804d-994b6386c863\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4" Mar 18 16:44:06.828889 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.828855 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7hss" event={"ID":"fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1","Type":"ContainerStarted","Data":"ed584554583f0b41500aa58e2c60f7f4347447cd5ce6a792f758b2478d9a42ef"} Mar 18 16:44:06.852853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.852759 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x7hss" podStartSLOduration=4.87127579 podStartE2EDuration="37.852745653s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:43:30.750235765 +0000 UTC m=+1.800051170" lastFinishedPulling="2026-03-18 16:44:03.731705624 +0000 UTC m=+34.781521033" observedRunningTime="2026-03-18 16:44:06.85184097 +0000 UTC m=+37.901656398" watchObservedRunningTime="2026-03-18 16:44:06.852745653 +0000 UTC m=+37.902561079" Mar 18 16:44:06.907379 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.907347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tckgv\" (UniqueName: \"kubernetes.io/projected/c1a43c8a-8d1f-4c15-804d-994b6386c863-kube-api-access-tckgv\") pod \"migrator-6b589cdcc-hxcp4\" (UID: \"c1a43c8a-8d1f-4c15-804d-994b6386c863\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4" Mar 18 16:44:06.917513 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.917491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tckgv\" (UniqueName: \"kubernetes.io/projected/c1a43c8a-8d1f-4c15-804d-994b6386c863-kube-api-access-tckgv\") pod \"migrator-6b589cdcc-hxcp4\" (UID: \"c1a43c8a-8d1f-4c15-804d-994b6386c863\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4" Mar 18 
16:44:06.951008 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:06.950975 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4" Mar 18 16:44:07.008677 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:07.008640 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:44:07.024632 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:07.024572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b-original-pull-secret\") pod \"global-pull-secret-syncer-6qxc8\" (UID: \"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b\") " pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:44:07.104565 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:07.104478 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6qxc8" Mar 18 16:44:07.116478 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:07.116450 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4"] Mar 18 16:44:07.119784 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:07.119758 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a43c8a_8d1f_4c15_804d_994b6386c863.slice/crio-924801b03cdd037ca114e8d7a51c4ed2f9f9a2cd5cfd46868dd36a8904d25503 WatchSource:0}: Error finding container 924801b03cdd037ca114e8d7a51c4ed2f9f9a2cd5cfd46868dd36a8904d25503: Status 404 returned error can't find the container with id 924801b03cdd037ca114e8d7a51c4ed2f9f9a2cd5cfd46868dd36a8904d25503 Mar 18 16:44:07.218408 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:07.218377 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6qxc8"] Mar 18 16:44:07.221267 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:07.221239 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7144ee2_f2d9_497d_acc9_a5a9a6afcf2b.slice/crio-9f67cedc4984610c2ab6c6ba96c7f511df423aa5ed41072e72a5b10acad029c6 WatchSource:0}: Error finding container 9f67cedc4984610c2ab6c6ba96c7f511df423aa5ed41072e72a5b10acad029c6: Status 404 returned error can't find the container with id 9f67cedc4984610c2ab6c6ba96c7f511df423aa5ed41072e72a5b10acad029c6 Mar 18 16:44:07.785854 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:07.785827 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9w7wj_0bc34432-5dcb-459c-a6d8-587f77ae9dcf/dns-node-resolver/0.log" Mar 18 16:44:07.833808 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:07.833731 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-6qxc8" event={"ID":"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b","Type":"ContainerStarted","Data":"9f67cedc4984610c2ab6c6ba96c7f511df423aa5ed41072e72a5b10acad029c6"} Mar 18 16:44:07.835099 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:07.835065 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4" event={"ID":"c1a43c8a-8d1f-4c15-804d-994b6386c863","Type":"ContainerStarted","Data":"924801b03cdd037ca114e8d7a51c4ed2f9f9a2cd5cfd46868dd36a8904d25503"} Mar 18 16:44:08.786116 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:08.785924 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sr4nn_ad6e055b-96d2-47b3-bb6b-f2e0165f3470/node-ca/0.log" Mar 18 16:44:08.915935 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:08.915899 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-jxhtd"] Mar 18 16:44:08.939246 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:08.939221 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-jxhtd"] Mar 18 16:44:08.939387 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:08.939333 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" Mar 18 16:44:08.941476 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:08.941454 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Mar 18 16:44:08.941585 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:08.941465 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Mar 18 16:44:08.942063 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:08.942028 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Mar 18 16:44:08.942063 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:08.942041 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Mar 18 16:44:08.942212 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:08.942043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-6ljvl\"" Mar 18 16:44:09.023152 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.023109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7967fe79-4d8c-47cb-813f-77e8b7d330e1-signing-cabundle\") pod \"service-ca-8bb587b94-jxhtd\" (UID: \"7967fe79-4d8c-47cb-813f-77e8b7d330e1\") " pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" Mar 18 16:44:09.023291 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.023252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7967fe79-4d8c-47cb-813f-77e8b7d330e1-signing-key\") pod \"service-ca-8bb587b94-jxhtd\" (UID: \"7967fe79-4d8c-47cb-813f-77e8b7d330e1\") " pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" 
Mar 18 16:44:09.023473 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.023309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8m7n\" (UniqueName: \"kubernetes.io/projected/7967fe79-4d8c-47cb-813f-77e8b7d330e1-kube-api-access-w8m7n\") pod \"service-ca-8bb587b94-jxhtd\" (UID: \"7967fe79-4d8c-47cb-813f-77e8b7d330e1\") " pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" Mar 18 16:44:09.124517 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.124476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7967fe79-4d8c-47cb-813f-77e8b7d330e1-signing-key\") pod \"service-ca-8bb587b94-jxhtd\" (UID: \"7967fe79-4d8c-47cb-813f-77e8b7d330e1\") " pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" Mar 18 16:44:09.124684 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.124537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8m7n\" (UniqueName: \"kubernetes.io/projected/7967fe79-4d8c-47cb-813f-77e8b7d330e1-kube-api-access-w8m7n\") pod \"service-ca-8bb587b94-jxhtd\" (UID: \"7967fe79-4d8c-47cb-813f-77e8b7d330e1\") " pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" Mar 18 16:44:09.124684 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.124570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7967fe79-4d8c-47cb-813f-77e8b7d330e1-signing-cabundle\") pod \"service-ca-8bb587b94-jxhtd\" (UID: \"7967fe79-4d8c-47cb-813f-77e8b7d330e1\") " pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" Mar 18 16:44:09.128380 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.128351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7967fe79-4d8c-47cb-813f-77e8b7d330e1-signing-key\") pod \"service-ca-8bb587b94-jxhtd\" 
(UID: \"7967fe79-4d8c-47cb-813f-77e8b7d330e1\") " pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" Mar 18 16:44:09.135645 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.135610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7967fe79-4d8c-47cb-813f-77e8b7d330e1-signing-cabundle\") pod \"service-ca-8bb587b94-jxhtd\" (UID: \"7967fe79-4d8c-47cb-813f-77e8b7d330e1\") " pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" Mar 18 16:44:09.137664 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.137639 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8m7n\" (UniqueName: \"kubernetes.io/projected/7967fe79-4d8c-47cb-813f-77e8b7d330e1-kube-api-access-w8m7n\") pod \"service-ca-8bb587b94-jxhtd\" (UID: \"7967fe79-4d8c-47cb-813f-77e8b7d330e1\") " pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" Mar 18 16:44:09.260803 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.260767 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" Mar 18 16:44:09.395982 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.395911 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-8bb587b94-jxhtd"] Mar 18 16:44:09.400967 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:09.400920 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7967fe79_4d8c_47cb_813f_77e8b7d330e1.slice/crio-6bf59cf30945112213a794ee4b1f6e72c53a5bce4800315bd3418abb0a954774 WatchSource:0}: Error finding container 6bf59cf30945112213a794ee4b1f6e72c53a5bce4800315bd3418abb0a954774: Status 404 returned error can't find the container with id 6bf59cf30945112213a794ee4b1f6e72c53a5bce4800315bd3418abb0a954774 Mar 18 16:44:09.840383 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.840331 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4" event={"ID":"c1a43c8a-8d1f-4c15-804d-994b6386c863","Type":"ContainerStarted","Data":"385c1530953f6072ba3411bb9bfb11c900cfea12ad80edffff096b56bea42ed0"} Mar 18 16:44:09.840383 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.840374 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4" event={"ID":"c1a43c8a-8d1f-4c15-804d-994b6386c863","Type":"ContainerStarted","Data":"f408e54acfd571ad422e998b6fbd6f0ca05690ed2f59ce0961c0076ac3435319"} Mar 18 16:44:09.841556 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.841520 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" event={"ID":"7967fe79-4d8c-47cb-813f-77e8b7d330e1","Type":"ContainerStarted","Data":"6bf59cf30945112213a794ee4b1f6e72c53a5bce4800315bd3418abb0a954774"} Mar 18 16:44:09.856412 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:09.856365 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-hxcp4" podStartSLOduration=2.17678475 podStartE2EDuration="3.856349385s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:07.122039721 +0000 UTC m=+38.171855125" lastFinishedPulling="2026-03-18 16:44:08.801604355 +0000 UTC m=+39.851419760" observedRunningTime="2026-03-18 16:44:09.855359028 +0000 UTC m=+40.905174455" watchObservedRunningTime="2026-03-18 16:44:09.856349385 +0000 UTC m=+40.906164811" Mar 18 16:44:10.133388 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:10.133300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:10.133388 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:10.133360 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:10.133841 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:10.133407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:10.133841 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:10.133424 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:10.133841 
ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:10.133493 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls podName:c617a8be-fcd4-4bd3-971f-b335484b9beb nodeName:}" failed. No retries permitted until 2026-03-18 16:44:18.133476763 +0000 UTC m=+49.183292173 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls") pod "dns-default-zlctd" (UID: "c617a8be-fcd4-4bd3-971f-b335484b9beb") : secret "dns-default-metrics-tls" not found Mar 18 16:44:10.133841 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:10.133514 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:10.133841 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:10.133527 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:10.133841 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:10.133548 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9cbc96769-fp2t2: secret "image-registry-tls" not found Mar 18 16:44:10.133841 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:10.133570 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert podName:4220311f-b2d3-43c5-87a6-ddf6edd88e2f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:18.133556535 +0000 UTC m=+49.183371941 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert") pod "ingress-canary-mv4xk" (UID: "4220311f-b2d3-43c5-87a6-ddf6edd88e2f") : secret "canary-serving-cert" not found Mar 18 16:44:10.133841 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:10.133611 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls podName:f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca nodeName:}" failed. No retries permitted until 2026-03-18 16:44:18.133594335 +0000 UTC m=+49.183409746 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls") pod "image-registry-9cbc96769-fp2t2" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca") : secret "image-registry-tls" not found Mar 18 16:44:11.846897 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:11.846856 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6qxc8" event={"ID":"d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b","Type":"ContainerStarted","Data":"9486f79d9dc0a20ca57a173f124ccf66ef3ef9e762ec271abaaa380fa52e5c38"} Mar 18 16:44:11.863273 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:11.863217 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6qxc8" podStartSLOduration=16.694066338 podStartE2EDuration="20.863202349s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="2026-03-18 16:44:07.222868946 +0000 UTC m=+38.272684360" lastFinishedPulling="2026-03-18 16:44:11.392004962 +0000 UTC m=+42.441820371" observedRunningTime="2026-03-18 16:44:11.862279661 +0000 UTC m=+42.912095080" watchObservedRunningTime="2026-03-18 16:44:11.863202349 +0000 UTC m=+42.913017776" Mar 18 16:44:13.852192 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:13.852155 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" event={"ID":"7967fe79-4d8c-47cb-813f-77e8b7d330e1","Type":"ContainerStarted","Data":"d3554a1b9cbd11ca44bc5f13be25687205c48634b40799a00aa6b27e0bc36d5f"} Mar 18 16:44:13.866657 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:13.866611 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-8bb587b94-jxhtd" podStartSLOduration=2.08801952 podStartE2EDuration="5.866595903s" podCreationTimestamp="2026-03-18 16:44:08 +0000 UTC" firstStartedPulling="2026-03-18 16:44:09.416453671 +0000 UTC m=+40.466269082" lastFinishedPulling="2026-03-18 16:44:13.195030057 +0000 UTC m=+44.244845465" observedRunningTime="2026-03-18 16:44:13.865687021 +0000 UTC m=+44.915502452" watchObservedRunningTime="2026-03-18 16:44:13.866595903 +0000 UTC m=+44.916411308" Mar 18 16:44:18.197932 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:18.197889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:18.197932 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:18.197935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls\") pod \"image-registry-9cbc96769-fp2t2\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:18.198563 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:18.197988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert\") pod \"ingress-canary-mv4xk\" (UID: 
\"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:18.198563 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:18.198055 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:18.198563 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:18.198116 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:18.198563 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:18.198134 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls podName:c617a8be-fcd4-4bd3-971f-b335484b9beb nodeName:}" failed. No retries permitted until 2026-03-18 16:44:34.198113554 +0000 UTC m=+65.247928959 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls") pod "dns-default-zlctd" (UID: "c617a8be-fcd4-4bd3-971f-b335484b9beb") : secret "dns-default-metrics-tls" not found Mar 18 16:44:18.198563 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:18.198139 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9cbc96769-fp2t2: secret "image-registry-tls" not found Mar 18 16:44:18.198563 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:18.198190 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls podName:f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca nodeName:}" failed. No retries permitted until 2026-03-18 16:44:34.198173871 +0000 UTC m=+65.247989282 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls") pod "image-registry-9cbc96769-fp2t2" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca") : secret "image-registry-tls" not found Mar 18 16:44:18.198563 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:18.198067 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:18.198563 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:18.198236 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert podName:4220311f-b2d3-43c5-87a6-ddf6edd88e2f nodeName:}" failed. No retries permitted until 2026-03-18 16:44:34.198227385 +0000 UTC m=+65.248042794 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert") pod "ingress-canary-mv4xk" (UID: "4220311f-b2d3-43c5-87a6-ddf6edd88e2f") : secret "canary-serving-cert" not found Mar 18 16:44:27.811071 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:27.811038 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g2jdr" Mar 18 16:44:31.757714 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.757682 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-5b85974fd6-4k2gx"] Mar 18 16:44:31.761914 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.761895 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-5b85974fd6-4k2gx" Mar 18 16:44:31.763696 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.763679 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 18 16:44:31.764053 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.764037 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 18 16:44:31.764120 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.764110 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-qxgc5\"" Mar 18 16:44:31.774662 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.774633 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-5b85974fd6-4k2gx"] Mar 18 16:44:31.775397 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.775376 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9cbc96769-fp2t2"] Mar 18 16:44:31.775570 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:31.775550 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" podUID="f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca" Mar 18 16:44:31.803782 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.803748 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqv2c\" (UniqueName: \"kubernetes.io/projected/b7fbe299-6374-4715-a716-d47ac426d698-kube-api-access-kqv2c\") pod \"downloads-5b85974fd6-4k2gx\" (UID: \"b7fbe299-6374-4715-a716-d47ac426d698\") " pod="openshift-console/downloads-5b85974fd6-4k2gx" Mar 18 16:44:31.866196 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.866158 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw"] Mar 18 16:44:31.869021 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.868996 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw" Mar 18 16:44:31.870037 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.870015 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-89d76c4f-f4bs8"] Mar 18 16:44:31.871582 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.871566 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Mar 18 16:44:31.872504 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.872483 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-5mdwq\"" Mar 18 16:44:31.872759 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.872743 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:31.888316 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.888287 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:31.889237 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.889217 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw"] Mar 18 16:44:31.892278 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.892263 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:31.899201 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.899172 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-89d76c4f-f4bs8"] Mar 18 16:44:31.901072 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.901052 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-c5nc7"] Mar 18 16:44:31.904141 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.904123 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:31.904259 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.904230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/962d1b72-c4cb-47c2-af2c-3921403d90f0-ca-trust-extracted\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:31.904330 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.904270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/962d1b72-c4cb-47c2-af2c-3921403d90f0-registry-certificates\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:31.904330 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.904320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/78f9ac4f-10cd-438c-8e38-5a772fb6f4e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-59lfw\" (UID: 
\"78f9ac4f-10cd-438c-8e38-5a772fb6f4e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw" Mar 18 16:44:31.904426 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.904347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/962d1b72-c4cb-47c2-af2c-3921403d90f0-trusted-ca\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:31.904426 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.904385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqv2c\" (UniqueName: \"kubernetes.io/projected/b7fbe299-6374-4715-a716-d47ac426d698-kube-api-access-kqv2c\") pod \"downloads-5b85974fd6-4k2gx\" (UID: \"b7fbe299-6374-4715-a716-d47ac426d698\") " pod="openshift-console/downloads-5b85974fd6-4k2gx" Mar 18 16:44:31.904515 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.904435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb658\" (UniqueName: \"kubernetes.io/projected/962d1b72-c4cb-47c2-af2c-3921403d90f0-kube-api-access-hb658\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:31.904561 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.904529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/962d1b72-c4cb-47c2-af2c-3921403d90f0-image-registry-private-configuration\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:31.904612 ip-10-0-133-190 kubenswrapper[2576]: I0318 
16:44:31.904572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/962d1b72-c4cb-47c2-af2c-3921403d90f0-bound-sa-token\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:31.904658 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.904606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/962d1b72-c4cb-47c2-af2c-3921403d90f0-registry-tls\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:31.904707 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.904670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/962d1b72-c4cb-47c2-af2c-3921403d90f0-installation-pull-secrets\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:31.907623 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.907604 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Mar 18 16:44:31.907720 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.907604 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Mar 18 16:44:31.907808 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.907791 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Mar 18 16:44:31.907855 ip-10-0-133-190 kubenswrapper[2576]: I0318 
16:44:31.907841 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5f2hh\"" Mar 18 16:44:31.908369 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.908357 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Mar 18 16:44:31.921091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.921072 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c5nc7"] Mar 18 16:44:31.933617 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:31.933593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqv2c\" (UniqueName: \"kubernetes.io/projected/b7fbe299-6374-4715-a716-d47ac426d698-kube-api-access-kqv2c\") pod \"downloads-5b85974fd6-4k2gx\" (UID: \"b7fbe299-6374-4715-a716-d47ac426d698\") " pod="openshift-console/downloads-5b85974fd6-4k2gx" Mar 18 16:44:32.005340 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005303 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-image-registry-private-configuration\") pod \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " Mar 18 16:44:32.005520 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005354 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-installation-pull-secrets\") pod \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " Mar 18 16:44:32.005520 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005390 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75dtn\" (UniqueName: 
\"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-kube-api-access-75dtn\") pod \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " Mar 18 16:44:32.005520 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005410 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-ca-trust-extracted\") pod \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " Mar 18 16:44:32.005520 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005442 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-bound-sa-token\") pod \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " Mar 18 16:44:32.005520 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005478 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-trusted-ca\") pod \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " Mar 18 16:44:32.005520 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005512 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-certificates\") pod \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\" (UID: \"f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca\") " Mar 18 16:44:32.005819 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/962d1b72-c4cb-47c2-af2c-3921403d90f0-image-registry-private-configuration\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.005819 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/962d1b72-c4cb-47c2-af2c-3921403d90f0-bound-sa-token\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.005819 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/962d1b72-c4cb-47c2-af2c-3921403d90f0-registry-tls\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.005819 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/962d1b72-c4cb-47c2-af2c-3921403d90f0-installation-pull-secrets\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.005819 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-data-volume\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.006045 
ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/962d1b72-c4cb-47c2-af2c-3921403d90f0-ca-trust-extracted\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.006045 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.005867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/962d1b72-c4cb-47c2-af2c-3921403d90f0-registry-certificates\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.006434 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.006383 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:44:32.006586 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.006566 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:44:32.006702 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.006682 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stzhp\" (UniqueName: \"kubernetes.io/projected/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-kube-api-access-stzhp\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.008667 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.007381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/78f9ac4f-10cd-438c-8e38-5a772fb6f4e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-59lfw\" (UID: \"78f9ac4f-10cd-438c-8e38-5a772fb6f4e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw" Mar 18 16:44:32.008667 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.007548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/962d1b72-c4cb-47c2-af2c-3921403d90f0-trusted-ca\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.008667 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.007714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.008667 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.007875 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.008667 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.007863 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:44:32.008667 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.007920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb658\" (UniqueName: \"kubernetes.io/projected/962d1b72-c4cb-47c2-af2c-3921403d90f0-kube-api-access-hb658\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.008667 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.008086 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:44:32.008667 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.007979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-crio-socket\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.008667 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.008591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/962d1b72-c4cb-47c2-af2c-3921403d90f0-registry-certificates\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.009199 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.009014 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:44:32.009469 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.009420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/962d1b72-c4cb-47c2-af2c-3921403d90f0-ca-trust-extracted\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.010008 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.009825 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-trusted-ca\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:44:32.010008 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.009858 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-certificates\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:44:32.010008 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.009879 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-ca-trust-extracted\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:44:32.010008 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.009895 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-bound-sa-token\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:44:32.010235 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.010156 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-kube-api-access-75dtn" 
(OuterVolumeSpecName: "kube-api-access-75dtn") pod "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca"). InnerVolumeSpecName "kube-api-access-75dtn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:44:32.012156 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.012122 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca" (UID: "f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:44:32.012584 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.012558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/78f9ac4f-10cd-438c-8e38-5a772fb6f4e1-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-59lfw\" (UID: \"78f9ac4f-10cd-438c-8e38-5a772fb6f4e1\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw" Mar 18 16:44:32.013843 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.013818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/962d1b72-c4cb-47c2-af2c-3921403d90f0-installation-pull-secrets\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.013959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.013880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/962d1b72-c4cb-47c2-af2c-3921403d90f0-registry-tls\") pod \"image-registry-89d76c4f-f4bs8\" (UID: 
\"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.014024 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.013988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/962d1b72-c4cb-47c2-af2c-3921403d90f0-image-registry-private-configuration\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.014392 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.014287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/962d1b72-c4cb-47c2-af2c-3921403d90f0-trusted-ca\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.014934 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.014914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/962d1b72-c4cb-47c2-af2c-3921403d90f0-bound-sa-token\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.016694 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.016673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb658\" (UniqueName: \"kubernetes.io/projected/962d1b72-c4cb-47c2-af2c-3921403d90f0-kube-api-access-hb658\") pod \"image-registry-89d76c4f-f4bs8\" (UID: \"962d1b72-c4cb-47c2-af2c-3921403d90f0\") " pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.071593 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.071558 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-5b85974fd6-4k2gx" Mar 18 16:44:32.111239 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.111239 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.111488 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-crio-socket\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.111488 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-data-volume\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.111488 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stzhp\" 
(UniqueName: \"kubernetes.io/projected/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-kube-api-access-stzhp\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.111488 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111448 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-image-registry-private-configuration\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:44:32.111488 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-crio-socket\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.111488 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111466 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-installation-pull-secrets\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:44:32.111488 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111481 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-75dtn\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-kube-api-access-75dtn\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:44:32.111824 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-data-volume\") pod \"insights-runtime-extractor-c5nc7\" (UID: 
\"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.111824 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.111814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.113653 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.113629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.118476 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.118452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stzhp\" (UniqueName: \"kubernetes.io/projected/c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e-kube-api-access-stzhp\") pod \"insights-runtime-extractor-c5nc7\" (UID: \"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e\") " pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.178873 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.178840 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw" Mar 18 16:44:32.185696 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.185673 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-27ljg\"" Mar 18 16:44:32.191706 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.191672 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-5b85974fd6-4k2gx"] Mar 18 16:44:32.194362 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.194338 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.195975 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:32.195925 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7fbe299_6374_4715_a716_d47ac426d698.slice/crio-b598cfe3b5c8472d2dd0cc554bd78bb1f94cc6c072e004aef75bafc7446487ee WatchSource:0}: Error finding container b598cfe3b5c8472d2dd0cc554bd78bb1f94cc6c072e004aef75bafc7446487ee: Status 404 returned error can't find the container with id b598cfe3b5c8472d2dd0cc554bd78bb1f94cc6c072e004aef75bafc7446487ee Mar 18 16:44:32.212389 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.212362 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c5nc7" Mar 18 16:44:32.323383 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.323322 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw"] Mar 18 16:44:32.326827 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:32.326773 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78f9ac4f_10cd_438c_8e38_5a772fb6f4e1.slice/crio-46e1e518957a143eefc77962226bed42e57973ba56ece8806d836c9a4ea350f4 WatchSource:0}: Error finding container 46e1e518957a143eefc77962226bed42e57973ba56ece8806d836c9a4ea350f4: Status 404 returned error can't find the container with id 46e1e518957a143eefc77962226bed42e57973ba56ece8806d836c9a4ea350f4 Mar 18 16:44:32.337703 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.337674 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-89d76c4f-f4bs8"] Mar 18 16:44:32.340531 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:32.340501 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod962d1b72_c4cb_47c2_af2c_3921403d90f0.slice/crio-3b259f00340f850bec67567db4d08e5fc0b00daaeb15bba120a76e3842902314 WatchSource:0}: Error finding container 3b259f00340f850bec67567db4d08e5fc0b00daaeb15bba120a76e3842902314: Status 404 returned error can't find the container with id 3b259f00340f850bec67567db4d08e5fc0b00daaeb15bba120a76e3842902314 Mar 18 16:44:32.351633 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.351610 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c5nc7"] Mar 18 16:44:32.354888 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:32.354864 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc49f1334_5f9a_4e6b_8d3c_dadd6d818f8e.slice/crio-13e54c6ca0260d592a746a34b3f837bf3918fe097adbb52e8116e847db1b4eaf WatchSource:0}: Error finding container 13e54c6ca0260d592a746a34b3f837bf3918fe097adbb52e8116e847db1b4eaf: Status 404 returned error can't find the container with id 13e54c6ca0260d592a746a34b3f837bf3918fe097adbb52e8116e847db1b4eaf Mar 18 16:44:32.895062 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.895015 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c5nc7" event={"ID":"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e","Type":"ContainerStarted","Data":"2fa052cbe41f3a8bf5562935b3a74fcaf69fc88026938a5741f426b19b087052"} Mar 18 16:44:32.895522 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.895070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c5nc7" event={"ID":"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e","Type":"ContainerStarted","Data":"13e54c6ca0260d592a746a34b3f837bf3918fe097adbb52e8116e847db1b4eaf"} Mar 18 16:44:32.896876 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.896839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" event={"ID":"962d1b72-c4cb-47c2-af2c-3921403d90f0","Type":"ContainerStarted","Data":"781f3052a89164c7ace2a0543df90403e999788c9ea5c67b6951bcd169643217"} Mar 18 16:44:32.896876 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.896876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" event={"ID":"962d1b72-c4cb-47c2-af2c-3921403d90f0","Type":"ContainerStarted","Data":"3b259f00340f850bec67567db4d08e5fc0b00daaeb15bba120a76e3842902314"} Mar 18 16:44:32.897061 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.897006 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:44:32.898441 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.898379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-5b85974fd6-4k2gx" event={"ID":"b7fbe299-6374-4715-a716-d47ac426d698","Type":"ContainerStarted","Data":"b598cfe3b5c8472d2dd0cc554bd78bb1f94cc6c072e004aef75bafc7446487ee"} Mar 18 16:44:32.899781 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.899753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw" event={"ID":"78f9ac4f-10cd-438c-8e38-5a772fb6f4e1","Type":"ContainerStarted","Data":"46e1e518957a143eefc77962226bed42e57973ba56ece8806d836c9a4ea350f4"} Mar 18 16:44:32.899781 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.899765 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9cbc96769-fp2t2" Mar 18 16:44:32.915456 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.915368 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" podStartSLOduration=1.915353986 podStartE2EDuration="1.915353986s" podCreationTimestamp="2026-03-18 16:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:32.914028447 +0000 UTC m=+63.963843874" watchObservedRunningTime="2026-03-18 16:44:32.915353986 +0000 UTC m=+63.965169438" Mar 18 16:44:32.942469 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.942437 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9cbc96769-fp2t2"] Mar 18 16:44:32.950486 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:32.950430 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-9cbc96769-fp2t2"] Mar 18 
16:44:33.020418 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:33.019907 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca-registry-tls\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:44:33.562197 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:33.562156 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca" path="/var/lib/kubelet/pods/f8367604-a4f9-4f59-8ed4-a8d5a8a0d9ca/volumes" Mar 18 16:44:33.905686 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:33.905642 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw" event={"ID":"78f9ac4f-10cd-438c-8e38-5a772fb6f4e1","Type":"ContainerStarted","Data":"f5aa5a803e3257179d493ae9321c0fb40f92dbc851ecbb9822f73a7c65b1ba1f"} Mar 18 16:44:33.906159 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:33.906089 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw" Mar 18 16:44:33.909095 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:33.909058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c5nc7" event={"ID":"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e","Type":"ContainerStarted","Data":"ee01c38a46c7f22b6c9476887b3815572c4ca8bfd5879c540121f8a329d803bc"} Mar 18 16:44:33.911625 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:33.911606 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw" Mar 18 16:44:33.922776 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:33.922706 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-59lfw" podStartSLOduration=1.542123785 podStartE2EDuration="2.922688561s" podCreationTimestamp="2026-03-18 16:44:31 +0000 UTC" firstStartedPulling="2026-03-18 16:44:32.328885799 +0000 UTC m=+63.378701205" lastFinishedPulling="2026-03-18 16:44:33.709450562 +0000 UTC m=+64.759265981" observedRunningTime="2026-03-18 16:44:33.921851198 +0000 UTC m=+64.971666626" watchObservedRunningTime="2026-03-18 16:44:33.922688561 +0000 UTC m=+64.972503992" Mar 18 16:44:34.230229 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.230186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:44:34.230425 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.230238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:34.230425 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.230287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:34.232284 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.232259 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:44:34.233245 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.233222 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c617a8be-fcd4-4bd3-971f-b335484b9beb-metrics-tls\") pod \"dns-default-zlctd\" (UID: \"c617a8be-fcd4-4bd3-971f-b335484b9beb\") " pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:34.233359 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.233344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4220311f-b2d3-43c5-87a6-ddf6edd88e2f-cert\") pod \"ingress-canary-mv4xk\" (UID: \"4220311f-b2d3-43c5-87a6-ddf6edd88e2f\") " pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:34.243683 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.243655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3720b07-5dd9-403e-9a66-17abd67f145f-metrics-certs\") pod \"network-metrics-daemon-bd2jt\" (UID: \"e3720b07-5dd9-403e-9a66-17abd67f145f\") " pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:44:34.331399 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.331312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z68v5\" (UniqueName: \"kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5\") pod \"network-check-target-j676s\" (UID: \"ce437450-60be-4502-aa99-03a4af6c8e7c\") " pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:44:34.333685 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.333648 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:44:34.344569 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.344541 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:44:34.356505 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:44:34.356471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z68v5\" (UniqueName: \"kubernetes.io/projected/ce437450-60be-4502-aa99-03a4af6c8e7c-kube-api-access-z68v5\") pod \"network-check-target-j676s\" (UID: \"ce437450-60be-4502-aa99-03a4af6c8e7c\") " pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:44:34.373142 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.373110 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nk2x6\"" Mar 18 16:44:34.380887 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.380817 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pqk66\"" Mar 18 16:44:34.381812 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.381741 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:44:34.389483 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.389457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bd2jt" Mar 18 16:44:34.490190 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.490106 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkmdp\"" Mar 18 16:44:34.498665 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.498633 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mv4xk" Mar 18 16:44:34.507704 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.507673 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bwt6\"" Mar 18 16:44:34.516296 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.516234 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:34.742879 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.742791 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-g4hrz"] Mar 18 16:44:34.770682 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.770652 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-g4hrz"] Mar 18 16:44:34.770859 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.770779 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.772793 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.772769 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Mar 18 16:44:34.772906 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.772771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Mar 18 16:44:34.773305 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.773287 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Mar 18 16:44:34.773651 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.773618 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Mar 18 16:44:34.773772 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.773731 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-nc8b9\"" Mar 18 16:44:34.773772 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.773618 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Mar 18 16:44:34.836029 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.835841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01085e71-bd91-4d49-a24f-b59a6e29f008-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.836029 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.835933 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp58f\" (UniqueName: \"kubernetes.io/projected/01085e71-bd91-4d49-a24f-b59a6e29f008-kube-api-access-cp58f\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.836029 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.836011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01085e71-bd91-4d49-a24f-b59a6e29f008-metrics-client-ca\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.836292 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.836050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/01085e71-bd91-4d49-a24f-b59a6e29f008-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.940623 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.936889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01085e71-bd91-4d49-a24f-b59a6e29f008-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.940623 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.936988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp58f\" (UniqueName: 
\"kubernetes.io/projected/01085e71-bd91-4d49-a24f-b59a6e29f008-kube-api-access-cp58f\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.940623 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.937015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01085e71-bd91-4d49-a24f-b59a6e29f008-metrics-client-ca\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.940623 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.937049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/01085e71-bd91-4d49-a24f-b59a6e29f008-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.940623 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:34.937174 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Mar 18 16:44:34.940623 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:44:34.937236 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01085e71-bd91-4d49-a24f-b59a6e29f008-prometheus-operator-tls podName:01085e71-bd91-4d49-a24f-b59a6e29f008 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:35.437216018 +0000 UTC m=+66.487031436 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/01085e71-bd91-4d49-a24f-b59a6e29f008-prometheus-operator-tls") pod "prometheus-operator-6b948c769-g4hrz" (UID: "01085e71-bd91-4d49-a24f-b59a6e29f008") : secret "prometheus-operator-tls" not found Mar 18 16:44:34.941815 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.941764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01085e71-bd91-4d49-a24f-b59a6e29f008-metrics-client-ca\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.950586 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.950507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01085e71-bd91-4d49-a24f-b59a6e29f008-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.952606 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.952455 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp58f\" (UniqueName: \"kubernetes.io/projected/01085e71-bd91-4d49-a24f-b59a6e29f008-kube-api-access-cp58f\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:34.993364 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:34.993285 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j676s"] Mar 18 16:44:35.013068 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.013035 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-zlctd"] Mar 18 16:44:35.125372 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:35.125331 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce437450_60be_4502_aa99_03a4af6c8e7c.slice/crio-83bd0f0b7f57f2fd1135dd979e203fb58bd2505e737b1774722d9e3121526b95 WatchSource:0}: Error finding container 83bd0f0b7f57f2fd1135dd979e203fb58bd2505e737b1774722d9e3121526b95: Status 404 returned error can't find the container with id 83bd0f0b7f57f2fd1135dd979e203fb58bd2505e737b1774722d9e3121526b95 Mar 18 16:44:35.125633 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:35.125593 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc617a8be_fcd4_4bd3_971f_b335484b9beb.slice/crio-4b2f9938a4a8967b1a7f7e1778353eb3ae89c646da592619c9385050c088a319 WatchSource:0}: Error finding container 4b2f9938a4a8967b1a7f7e1778353eb3ae89c646da592619c9385050c088a319: Status 404 returned error can't find the container with id 4b2f9938a4a8967b1a7f7e1778353eb3ae89c646da592619c9385050c088a319 Mar 18 16:44:35.238111 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.238081 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mv4xk"] Mar 18 16:44:35.241552 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.241504 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bd2jt"] Mar 18 16:44:35.242253 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:35.242226 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4220311f_b2d3_43c5_87a6_ddf6edd88e2f.slice/crio-0f222d6c306dfdf1f2e6537a7a59220b1921a8d59e328118fdc97d6912b9a7e0 WatchSource:0}: Error finding container 0f222d6c306dfdf1f2e6537a7a59220b1921a8d59e328118fdc97d6912b9a7e0: Status 404 returned error 
can't find the container with id 0f222d6c306dfdf1f2e6537a7a59220b1921a8d59e328118fdc97d6912b9a7e0 Mar 18 16:44:35.248014 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:35.247986 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3720b07_5dd9_403e_9a66_17abd67f145f.slice/crio-660936feb8c49bae9e3659ecaf1bb3f061d97f22c881c706632df72fb2080ccf WatchSource:0}: Error finding container 660936feb8c49bae9e3659ecaf1bb3f061d97f22c881c706632df72fb2080ccf: Status 404 returned error can't find the container with id 660936feb8c49bae9e3659ecaf1bb3f061d97f22c881c706632df72fb2080ccf Mar 18 16:44:35.442378 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.442350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/01085e71-bd91-4d49-a24f-b59a6e29f008-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:35.445021 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.444985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/01085e71-bd91-4d49-a24f-b59a6e29f008-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-g4hrz\" (UID: \"01085e71-bd91-4d49-a24f-b59a6e29f008\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:35.683512 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.683431 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" Mar 18 16:44:35.854874 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.854832 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-g4hrz"] Mar 18 16:44:35.860223 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:35.860192 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01085e71_bd91_4d49_a24f_b59a6e29f008.slice/crio-f121e0e57dde2ef34ccd4c3dc9c2d8765362eb0d38852712d0076ddb45875450 WatchSource:0}: Error finding container f121e0e57dde2ef34ccd4c3dc9c2d8765362eb0d38852712d0076ddb45875450: Status 404 returned error can't find the container with id f121e0e57dde2ef34ccd4c3dc9c2d8765362eb0d38852712d0076ddb45875450 Mar 18 16:44:35.919886 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.919817 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bd2jt" event={"ID":"e3720b07-5dd9-403e-9a66-17abd67f145f","Type":"ContainerStarted","Data":"660936feb8c49bae9e3659ecaf1bb3f061d97f22c881c706632df72fb2080ccf"} Mar 18 16:44:35.922392 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.922335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mv4xk" event={"ID":"4220311f-b2d3-43c5-87a6-ddf6edd88e2f","Type":"ContainerStarted","Data":"0f222d6c306dfdf1f2e6537a7a59220b1921a8d59e328118fdc97d6912b9a7e0"} Mar 18 16:44:35.924824 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.924778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" event={"ID":"01085e71-bd91-4d49-a24f-b59a6e29f008","Type":"ContainerStarted","Data":"f121e0e57dde2ef34ccd4c3dc9c2d8765362eb0d38852712d0076ddb45875450"} Mar 18 16:44:35.926315 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.926265 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/dns-default-zlctd" event={"ID":"c617a8be-fcd4-4bd3-971f-b335484b9beb","Type":"ContainerStarted","Data":"4b2f9938a4a8967b1a7f7e1778353eb3ae89c646da592619c9385050c088a319"} Mar 18 16:44:35.928755 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.928712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j676s" event={"ID":"ce437450-60be-4502-aa99-03a4af6c8e7c","Type":"ContainerStarted","Data":"83bd0f0b7f57f2fd1135dd979e203fb58bd2505e737b1774722d9e3121526b95"} Mar 18 16:44:35.934006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.933885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c5nc7" event={"ID":"c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e","Type":"ContainerStarted","Data":"9dee8568d1a006fc77454bbab9e25f3e8330e0f58133fd9d73043cac79f02be1"} Mar 18 16:44:35.952787 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:35.951815 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-c5nc7" podStartSLOduration=2.20084581 podStartE2EDuration="4.951793745s" podCreationTimestamp="2026-03-18 16:44:31 +0000 UTC" firstStartedPulling="2026-03-18 16:44:32.407423053 +0000 UTC m=+63.457238459" lastFinishedPulling="2026-03-18 16:44:35.158370989 +0000 UTC m=+66.208186394" observedRunningTime="2026-03-18 16:44:35.950660285 +0000 UTC m=+67.000475709" watchObservedRunningTime="2026-03-18 16:44:35.951793745 +0000 UTC m=+67.001609173" Mar 18 16:44:37.116145 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.115712 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7858d76678-42r8h"] Mar 18 16:44:37.119344 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.119315 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.121661 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.121636 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 18 16:44:37.122635 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.121864 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 18 16:44:37.122635 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.122056 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-l5zrn\"" Mar 18 16:44:37.122635 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.122161 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Mar 18 16:44:37.122635 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.122291 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 18 16:44:37.122635 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.122497 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Mar 18 16:44:37.129623 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.129589 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7858d76678-42r8h"] Mar 18 16:44:37.163888 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.163853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bldj\" (UniqueName: \"kubernetes.io/projected/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-kube-api-access-9bldj\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.164090 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:44:37.163985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-oauth-serving-cert\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.164090 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.164047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-serving-cert\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.164203 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.164091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-service-ca\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.164203 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.164122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-oauth-config\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.164203 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.164149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-config\") pod 
\"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.265285 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.265237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bldj\" (UniqueName: \"kubernetes.io/projected/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-kube-api-access-9bldj\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.265471 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.265295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-oauth-serving-cert\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.265471 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.265353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-serving-cert\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.265471 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.265391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-service-ca\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.265471 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.265422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-oauth-config\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.265471 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.265445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-config\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.266213 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.266162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-oauth-serving-cert\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.266397 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.266325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-config\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.266397 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.266352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-service-ca\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.269103 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.269031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-oauth-config\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.269624 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.269580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-serving-cert\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.274432 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.274408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bldj\" (UniqueName: \"kubernetes.io/projected/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-kube-api-access-9bldj\") pod \"console-7858d76678-42r8h\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:37.432983 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:37.432884 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:39.916061 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:39.916032 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7858d76678-42r8h"] Mar 18 16:44:39.922868 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:39.922749 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0adc9b1_b3ba_423f_8f71_ed54d50bf40d.slice/crio-f2ff82095ade4b96d94bce4ce2fa5e877c2b709cbca6c6a3c2f0f04e2f0989f3 WatchSource:0}: Error finding container f2ff82095ade4b96d94bce4ce2fa5e877c2b709cbca6c6a3c2f0f04e2f0989f3: Status 404 returned error can't find the container with id f2ff82095ade4b96d94bce4ce2fa5e877c2b709cbca6c6a3c2f0f04e2f0989f3 Mar 18 16:44:39.956246 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:39.956135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" event={"ID":"01085e71-bd91-4d49-a24f-b59a6e29f008","Type":"ContainerStarted","Data":"1833b0b75ba7facb7e991872d77add8acc3727691b46fa598f2cce50abe2c96a"} Mar 18 16:44:39.958210 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:39.958176 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j676s" event={"ID":"ce437450-60be-4502-aa99-03a4af6c8e7c","Type":"ContainerStarted","Data":"87db5eea84d87399b4551b213e95934015e24b32a2034bd348c1bda3532701d4"} Mar 18 16:44:39.958926 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:39.958905 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:44:39.961627 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:39.961504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mv4xk" 
event={"ID":"4220311f-b2d3-43c5-87a6-ddf6edd88e2f","Type":"ContainerStarted","Data":"edcdc4feb581062076fbdcf65beb108c610eefc07d615f61afa1330e6fb2c1d9"} Mar 18 16:44:39.963006 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:39.962975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7858d76678-42r8h" event={"ID":"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d","Type":"ContainerStarted","Data":"f2ff82095ade4b96d94bce4ce2fa5e877c2b709cbca6c6a3c2f0f04e2f0989f3"} Mar 18 16:44:39.992745 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:39.991459 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-j676s" podStartSLOduration=66.333809028 podStartE2EDuration="1m10.991440489s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.152228763 +0000 UTC m=+66.202044182" lastFinishedPulling="2026-03-18 16:44:39.809860232 +0000 UTC m=+70.859675643" observedRunningTime="2026-03-18 16:44:39.991005321 +0000 UTC m=+71.040820748" watchObservedRunningTime="2026-03-18 16:44:39.991440489 +0000 UTC m=+71.041255917" Mar 18 16:44:40.968207 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:40.968166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" event={"ID":"01085e71-bd91-4d49-a24f-b59a6e29f008","Type":"ContainerStarted","Data":"05c0b744222da98a0a5661ae61b91e82e8a22224843028154c05250f258ec509"} Mar 18 16:44:40.970402 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:40.970372 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zlctd" event={"ID":"c617a8be-fcd4-4bd3-971f-b335484b9beb","Type":"ContainerStarted","Data":"efd05f816b91a50e26aacf49b5b7829b0ed8d59ea71bd30b60d8dbc46f151e16"} Mar 18 16:44:40.970527 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:40.970409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-zlctd" event={"ID":"c617a8be-fcd4-4bd3-971f-b335484b9beb","Type":"ContainerStarted","Data":"8cc1897b6089964e3fb5fc63432fa6b66d225ff8f1591ee6bcb361b6717a4060"} Mar 18 16:44:40.970527 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:40.970459 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:40.972358 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:40.972329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bd2jt" event={"ID":"e3720b07-5dd9-403e-9a66-17abd67f145f","Type":"ContainerStarted","Data":"32fba25c9ddba23e474b86cb9df2c8c995a4d3f0260c6d288dda6c70e0da4697"} Mar 18 16:44:40.972481 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:40.972362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bd2jt" event={"ID":"e3720b07-5dd9-403e-9a66-17abd67f145f","Type":"ContainerStarted","Data":"c20483a5f879b4cfcadcba9c2d08b89914f18f09b7a7bee5d10cd77ed8376552"} Mar 18 16:44:40.984569 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:40.984521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mv4xk" podStartSLOduration=34.478978022 podStartE2EDuration="38.984505155s" podCreationTimestamp="2026-03-18 16:44:02 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.244452547 +0000 UTC m=+66.294267957" lastFinishedPulling="2026-03-18 16:44:39.749979676 +0000 UTC m=+70.799795090" observedRunningTime="2026-03-18 16:44:40.030365621 +0000 UTC m=+71.080181050" watchObservedRunningTime="2026-03-18 16:44:40.984505155 +0000 UTC m=+72.034320565" Mar 18 16:44:40.985538 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:40.985494 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6b948c769-g4hrz" podStartSLOduration=3.098241924 
podStartE2EDuration="6.985481227s" podCreationTimestamp="2026-03-18 16:44:34 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.865871253 +0000 UTC m=+66.915686675" lastFinishedPulling="2026-03-18 16:44:39.753110566 +0000 UTC m=+70.802925978" observedRunningTime="2026-03-18 16:44:40.983845113 +0000 UTC m=+72.033660541" watchObservedRunningTime="2026-03-18 16:44:40.985481227 +0000 UTC m=+72.035296657" Mar 18 16:44:40.998115 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:40.998042 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bd2jt" podStartSLOduration=67.504378396 podStartE2EDuration="1m11.998028945s" podCreationTimestamp="2026-03-18 16:43:29 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.250822034 +0000 UTC m=+66.300637457" lastFinishedPulling="2026-03-18 16:44:39.744472583 +0000 UTC m=+70.794288006" observedRunningTime="2026-03-18 16:44:40.997486954 +0000 UTC m=+72.047302476" watchObservedRunningTime="2026-03-18 16:44:40.998028945 +0000 UTC m=+72.047844374" Mar 18 16:44:41.015199 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:41.015138 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zlctd" podStartSLOduration=34.416825521 podStartE2EDuration="39.015121148s" podCreationTimestamp="2026-03-18 16:44:02 +0000 UTC" firstStartedPulling="2026-03-18 16:44:35.152280418 +0000 UTC m=+66.202095829" lastFinishedPulling="2026-03-18 16:44:39.750576052 +0000 UTC m=+70.800391456" observedRunningTime="2026-03-18 16:44:41.014521354 +0000 UTC m=+72.064336785" watchObservedRunningTime="2026-03-18 16:44:41.015121148 +0000 UTC m=+72.064936577" Mar 18 16:44:43.135282 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.135049 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ctmdb"] Mar 18 16:44:43.211974 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.211919 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.215575 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.214760 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-t8t9n\"" Mar 18 16:44:43.215575 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.215102 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Mar 18 16:44:43.215575 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.215219 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Mar 18 16:44:43.215575 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.215322 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Mar 18 16:44:43.316573 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.316543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-sys\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.316758 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.316587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-textfile\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.316758 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.316617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-wtmp\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.316758 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.316640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vjmx\" (UniqueName: \"kubernetes.io/projected/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-kube-api-access-8vjmx\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.316758 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.316666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-accelerators-collector-config\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.316758 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.316690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-tls\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.316758 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.316728 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-root\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 
16:44:43.317105 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.316766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.317105 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.316812 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-metrics-client-ca\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417326 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-sys\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417326 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-textfile\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417550 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-wtmp\") pod \"node-exporter-ctmdb\" (UID: 
\"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417550 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vjmx\" (UniqueName: \"kubernetes.io/projected/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-kube-api-access-8vjmx\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417550 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-accelerators-collector-config\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417550 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-sys\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417550 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-tls\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417550 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-root\") pod 
\"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417550 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417550 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-wtmp\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417966 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-root\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417966 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-metrics-client-ca\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.417966 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.417741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-textfile\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.418216 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.418172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-metrics-client-ca\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.418309 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.418217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-accelerators-collector-config\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.420358 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.420331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.420853 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.420831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-node-exporter-tls\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.426401 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.426377 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vjmx\" (UniqueName: \"kubernetes.io/projected/b33135c9-e8d8-4c75-9332-67bbe7bfcaab-kube-api-access-8vjmx\") pod \"node-exporter-ctmdb\" (UID: \"b33135c9-e8d8-4c75-9332-67bbe7bfcaab\") " pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.526497 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.526403 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ctmdb" Mar 18 16:44:43.613869 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:43.613832 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb33135c9_e8d8_4c75_9332_67bbe7bfcaab.slice/crio-d05cd9cac007d9ba4192662e4e071e76f9394386f85ffa4390008f6234f8aa7c WatchSource:0}: Error finding container d05cd9cac007d9ba4192662e4e071e76f9394386f85ffa4390008f6234f8aa7c: Status 404 returned error can't find the container with id d05cd9cac007d9ba4192662e4e071e76f9394386f85ffa4390008f6234f8aa7c Mar 18 16:44:43.984182 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.984058 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7858d76678-42r8h" event={"ID":"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d","Type":"ContainerStarted","Data":"25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7"} Mar 18 16:44:43.985295 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:43.985268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ctmdb" event={"ID":"b33135c9-e8d8-4c75-9332-67bbe7bfcaab","Type":"ContainerStarted","Data":"d05cd9cac007d9ba4192662e4e071e76f9394386f85ffa4390008f6234f8aa7c"} Mar 18 16:44:44.001129 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:44.001078 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7858d76678-42r8h" podStartSLOduration=3.328572588 
podStartE2EDuration="7.001060639s" podCreationTimestamp="2026-03-18 16:44:37 +0000 UTC" firstStartedPulling="2026-03-18 16:44:39.943048746 +0000 UTC m=+70.992864154" lastFinishedPulling="2026-03-18 16:44:43.615536796 +0000 UTC m=+74.665352205" observedRunningTime="2026-03-18 16:44:44.000577369 +0000 UTC m=+75.050392824" watchObservedRunningTime="2026-03-18 16:44:44.001060639 +0000 UTC m=+75.050876079" Mar 18 16:44:47.433557 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:47.433512 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:47.433557 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:47.433563 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:47.438978 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:47.438931 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:48.007446 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:48.007415 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:44:50.154254 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.154121 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fc7687f76-452pq"] Mar 18 16:44:50.158733 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.158708 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.172612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.172585 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 18 16:44:50.175458 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.175432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fc7687f76-452pq"] Mar 18 16:44:50.274569 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.274534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-trusted-ca-bundle\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.274747 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.274582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-oauth-config\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.274747 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.274600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-oauth-serving-cert\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.274747 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.274687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-service-ca\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.274747 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.274744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x57mp\" (UniqueName: \"kubernetes.io/projected/55ff9595-d7e5-4453-8bcd-095510d332d3-kube-api-access-x57mp\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.274985 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.274774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-serving-cert\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.274985 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.274804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-console-config\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.375744 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.375700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-trusted-ca-bundle\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.375919 ip-10-0-133-190 kubenswrapper[2576]: I0318 
16:44:50.375767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-oauth-config\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.375919 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.375798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-oauth-serving-cert\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.375919 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.375837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-service-ca\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.375919 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.375877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x57mp\" (UniqueName: \"kubernetes.io/projected/55ff9595-d7e5-4453-8bcd-095510d332d3-kube-api-access-x57mp\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.375919 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.375903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-serving-cert\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " 
pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.376215 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.375927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-console-config\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.376639 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.376608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-oauth-serving-cert\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.376639 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.376632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-service-ca\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.376796 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.376687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-console-config\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.376796 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.376697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-trusted-ca-bundle\") pod \"console-5fc7687f76-452pq\" (UID: 
\"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.379128 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.379108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-serving-cert\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.379230 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.379169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-oauth-config\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.384165 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.384144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x57mp\" (UniqueName: \"kubernetes.io/projected/55ff9595-d7e5-4453-8bcd-095510d332d3-kube-api-access-x57mp\") pod \"console-5fc7687f76-452pq\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") " pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.472489 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.472454 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:44:50.978239 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:50.978206 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zlctd" Mar 18 16:44:51.385278 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:51.385201 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fc7687f76-452pq"] Mar 18 16:44:51.388816 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:44:51.388779 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55ff9595_d7e5_4453_8bcd_095510d332d3.slice/crio-eaba5850e06d9a48c030e7f2c1c6fbc2dd3127dad62eab38f94c183c859b0160 WatchSource:0}: Error finding container eaba5850e06d9a48c030e7f2c1c6fbc2dd3127dad62eab38f94c183c859b0160: Status 404 returned error can't find the container with id eaba5850e06d9a48c030e7f2c1c6fbc2dd3127dad62eab38f94c183c859b0160 Mar 18 16:44:52.017829 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:52.017788 2576 generic.go:358] "Generic (PLEG): container finished" podID="b33135c9-e8d8-4c75-9332-67bbe7bfcaab" containerID="e032e55272c54cea1e70aeb7e8cc0077af179e265102e992941296ccfea0fff1" exitCode=0 Mar 18 16:44:52.018028 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:52.017930 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ctmdb" event={"ID":"b33135c9-e8d8-4c75-9332-67bbe7bfcaab","Type":"ContainerDied","Data":"e032e55272c54cea1e70aeb7e8cc0077af179e265102e992941296ccfea0fff1"} Mar 18 16:44:52.019739 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:52.019711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc7687f76-452pq" event={"ID":"55ff9595-d7e5-4453-8bcd-095510d332d3","Type":"ContainerStarted","Data":"7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98"} Mar 18 16:44:52.019863 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:44:52.019750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc7687f76-452pq" event={"ID":"55ff9595-d7e5-4453-8bcd-095510d332d3","Type":"ContainerStarted","Data":"eaba5850e06d9a48c030e7f2c1c6fbc2dd3127dad62eab38f94c183c859b0160"} Mar 18 16:44:52.021336 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:52.021303 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-5b85974fd6-4k2gx" event={"ID":"b7fbe299-6374-4715-a716-d47ac426d698","Type":"ContainerStarted","Data":"48ab9cf58a95c1679c8e906709fce41e88228918ac100fc72f600968fa730f4d"} Mar 18 16:44:52.021548 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:52.021518 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-5b85974fd6-4k2gx" Mar 18 16:44:52.028034 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:52.028009 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-5b85974fd6-4k2gx" Mar 18 16:44:52.056630 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:52.056579 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fc7687f76-452pq" podStartSLOduration=2.056561734 podStartE2EDuration="2.056561734s" podCreationTimestamp="2026-03-18 16:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:52.055442683 +0000 UTC m=+83.105258112" watchObservedRunningTime="2026-03-18 16:44:52.056561734 +0000 UTC m=+83.106377161" Mar 18 16:44:52.070212 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:52.070151 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-5b85974fd6-4k2gx" podStartSLOduration=1.956295522 podStartE2EDuration="21.070131187s" podCreationTimestamp="2026-03-18 16:44:31 +0000 UTC" 
firstStartedPulling="2026-03-18 16:44:32.198966697 +0000 UTC m=+63.248782105" lastFinishedPulling="2026-03-18 16:44:51.312802356 +0000 UTC m=+82.362617770" observedRunningTime="2026-03-18 16:44:52.06990655 +0000 UTC m=+83.119721980" watchObservedRunningTime="2026-03-18 16:44:52.070131187 +0000 UTC m=+83.119946615" Mar 18 16:44:53.026657 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:53.026616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ctmdb" event={"ID":"b33135c9-e8d8-4c75-9332-67bbe7bfcaab","Type":"ContainerStarted","Data":"2cd3d8bf18a8cc6bc17ae5551cf6e321385a774a88b0e32190420715f02c3032"} Mar 18 16:44:53.027156 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:53.026668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ctmdb" event={"ID":"b33135c9-e8d8-4c75-9332-67bbe7bfcaab","Type":"ContainerStarted","Data":"b39ab7c80ee8c3aad9298a7c0f48a7fa6d33c1063bc07fe2094e9bc204ae92b4"} Mar 18 16:44:53.047768 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:53.047708 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ctmdb" podStartSLOduration=2.398530778 podStartE2EDuration="10.047689539s" podCreationTimestamp="2026-03-18 16:44:43 +0000 UTC" firstStartedPulling="2026-03-18 16:44:43.61610721 +0000 UTC m=+74.665922619" lastFinishedPulling="2026-03-18 16:44:51.265265975 +0000 UTC m=+82.315081380" observedRunningTime="2026-03-18 16:44:53.046625522 +0000 UTC m=+84.096440949" watchObservedRunningTime="2026-03-18 16:44:53.047689539 +0000 UTC m=+84.097504968" Mar 18 16:44:53.913925 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:44:53.913891 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-89d76c4f-f4bs8" Mar 18 16:45:00.472594 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:00.472553 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:45:00.472594 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:00.472593 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:45:00.477335 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:00.477316 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:45:01.056863 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:01.056834 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fc7687f76-452pq" Mar 18 16:45:01.105442 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:01.105409 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7858d76678-42r8h"] Mar 18 16:45:11.978671 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:11.978634 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-j676s" Mar 18 16:45:26.124375 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.124313 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7858d76678-42r8h" podUID="e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" containerName="console" containerID="cri-o://25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7" gracePeriod=15 Mar 18 16:45:26.364506 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.364482 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7858d76678-42r8h_e0adc9b1-b3ba-423f-8f71-ed54d50bf40d/console/0.log" Mar 18 16:45:26.364625 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.364546 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:45:26.457350 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.457265 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bldj\" (UniqueName: \"kubernetes.io/projected/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-kube-api-access-9bldj\") pod \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " Mar 18 16:45:26.457350 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.457307 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-config\") pod \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " Mar 18 16:45:26.457350 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.457335 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-serving-cert\") pod \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " Mar 18 16:45:26.457615 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.457364 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-oauth-serving-cert\") pod \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " Mar 18 16:45:26.457615 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.457437 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-oauth-config\") pod \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " Mar 18 
16:45:26.457615 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.457464 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-service-ca\") pod \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\" (UID: \"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d\") " Mar 18 16:45:26.457822 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.457790 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-config" (OuterVolumeSpecName: "console-config") pod "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" (UID: "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:26.457888 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.457840 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-service-ca" (OuterVolumeSpecName: "service-ca") pod "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" (UID: "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:26.457888 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.457851 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" (UID: "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:26.459657 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.459634 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-kube-api-access-9bldj" (OuterVolumeSpecName: "kube-api-access-9bldj") pod "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" (UID: "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d"). InnerVolumeSpecName "kube-api-access-9bldj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:45:26.459784 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.459673 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" (UID: "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:26.459784 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.459733 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" (UID: "e0adc9b1-b3ba-423f-8f71-ed54d50bf40d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:26.558564 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.558514 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-oauth-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:45:26.558564 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.558557 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-service-ca\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:45:26.558564 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.558568 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9bldj\" (UniqueName: \"kubernetes.io/projected/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-kube-api-access-9bldj\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:45:26.558564 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.558578 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:45:26.558828 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.558587 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-console-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:45:26.558828 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:26.558596 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d-oauth-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:45:27.128062 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:45:27.128032 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7858d76678-42r8h_e0adc9b1-b3ba-423f-8f71-ed54d50bf40d/console/0.log" Mar 18 16:45:27.128475 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:27.128071 2576 generic.go:358] "Generic (PLEG): container finished" podID="e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" containerID="25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7" exitCode=2 Mar 18 16:45:27.128475 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:27.128102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7858d76678-42r8h" event={"ID":"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d","Type":"ContainerDied","Data":"25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7"} Mar 18 16:45:27.128475 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:27.128125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7858d76678-42r8h" event={"ID":"e0adc9b1-b3ba-423f-8f71-ed54d50bf40d","Type":"ContainerDied","Data":"f2ff82095ade4b96d94bce4ce2fa5e877c2b709cbca6c6a3c2f0f04e2f0989f3"} Mar 18 16:45:27.128475 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:27.128138 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7858d76678-42r8h" Mar 18 16:45:27.128475 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:27.128139 2576 scope.go:117] "RemoveContainer" containerID="25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7" Mar 18 16:45:27.136542 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:27.136523 2576 scope.go:117] "RemoveContainer" containerID="25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7" Mar 18 16:45:27.136792 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:45:27.136769 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7\": container with ID starting with 25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7 not found: ID does not exist" containerID="25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7" Mar 18 16:45:27.136850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:27.136801 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7"} err="failed to get container status \"25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7\": rpc error: code = NotFound desc = could not find container \"25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7\": container with ID starting with 25d10bbff3332012774633ce04a83403167a0b618297b0c52056c620976636b7 not found: ID does not exist" Mar 18 16:45:27.147738 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:27.147713 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7858d76678-42r8h"] Mar 18 16:45:27.152852 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:45:27.152824 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7858d76678-42r8h"] Mar 18 16:45:27.562478 ip-10-0-133-190 kubenswrapper[2576]: 
I0318 16:45:27.562441 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" path="/var/lib/kubelet/pods/e0adc9b1-b3ba-423f-8f71-ed54d50bf40d/volumes"
Mar 18 16:46:01.269596 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.269559 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b5f87f654-8m79q"]
Mar 18 16:46:01.270082 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.269839 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" containerName="console"
Mar 18 16:46:01.270082 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.269850 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" containerName="console"
Mar 18 16:46:01.270082 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.269890 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0adc9b1-b3ba-423f-8f71-ed54d50bf40d" containerName="console"
Mar 18 16:46:01.272820 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.272804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.282683 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.282658 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b5f87f654-8m79q"]
Mar 18 16:46:01.416437 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.416389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qvw2\" (UniqueName: \"kubernetes.io/projected/58e6ce32-c778-4728-8c51-bf2e4f5525d0-kube-api-access-5qvw2\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.416437 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.416443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-config\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.416682 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.416469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-service-ca\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.416682 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.416548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-trusted-ca-bundle\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.416682 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.416590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-serving-cert\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.416682 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.416623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-oauth-serving-cert\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.416682 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.416645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-oauth-config\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.517763 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.517716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-service-ca\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.517763 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.517770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-trusted-ca-bundle\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.518048 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.517895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-serving-cert\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.518048 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.517989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-oauth-serving-cert\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.518048 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.518032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-oauth-config\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.518195 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.518079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qvw2\" (UniqueName: \"kubernetes.io/projected/58e6ce32-c778-4728-8c51-bf2e4f5525d0-kube-api-access-5qvw2\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.518195 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.518128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-config\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.518553 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.518523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-service-ca\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.518687 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.518620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-trusted-ca-bundle\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.518764 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.518742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-config\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.520525 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.520461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-oauth-serving-cert\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.520525 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.520516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-oauth-config\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.520638 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.520536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-serving-cert\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.525988 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.525967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qvw2\" (UniqueName: \"kubernetes.io/projected/58e6ce32-c778-4728-8c51-bf2e4f5525d0-kube-api-access-5qvw2\") pod \"console-6b5f87f654-8m79q\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.582367 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.582326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:01.702743 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:01.702699 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b5f87f654-8m79q"]
Mar 18 16:46:01.705799 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:46:01.705769 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58e6ce32_c778_4728_8c51_bf2e4f5525d0.slice/crio-931531dd1475ed8343aa39dbca9aa572e851abf2b4fea6b34004ba4c301dc718 WatchSource:0}: Error finding container 931531dd1475ed8343aa39dbca9aa572e851abf2b4fea6b34004ba4c301dc718: Status 404 returned error can't find the container with id 931531dd1475ed8343aa39dbca9aa572e851abf2b4fea6b34004ba4c301dc718
Mar 18 16:46:02.220830 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:02.220790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5f87f654-8m79q" event={"ID":"58e6ce32-c778-4728-8c51-bf2e4f5525d0","Type":"ContainerStarted","Data":"addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e"}
Mar 18 16:46:02.220830 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:02.220830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5f87f654-8m79q" event={"ID":"58e6ce32-c778-4728-8c51-bf2e4f5525d0","Type":"ContainerStarted","Data":"931531dd1475ed8343aa39dbca9aa572e851abf2b4fea6b34004ba4c301dc718"}
Mar 18 16:46:02.237644 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:02.237584 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b5f87f654-8m79q" podStartSLOduration=1.237564167 podStartE2EDuration="1.237564167s" podCreationTimestamp="2026-03-18 16:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:46:02.236577997 +0000 UTC m=+153.286393493" watchObservedRunningTime="2026-03-18 16:46:02.237564167 +0000 UTC m=+153.287379594"
Mar 18 16:46:11.583442 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:11.583407 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:11.583996 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:11.583500 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:11.588526 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:11.588502 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:12.254053 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:12.254018 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b5f87f654-8m79q"
Mar 18 16:46:12.300098 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:12.300065 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fc7687f76-452pq"]
Mar 18 16:46:37.318802 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.318741 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fc7687f76-452pq" podUID="55ff9595-d7e5-4453-8bcd-095510d332d3" containerName="console" containerID="cri-o://7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98" gracePeriod=15
Mar 18 16:46:37.566317 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.566293 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fc7687f76-452pq_55ff9595-d7e5-4453-8bcd-095510d332d3/console/0.log"
Mar 18 16:46:37.566444 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.566353 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fc7687f76-452pq"
Mar 18 16:46:37.696353 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696269 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-console-config\") pod \"55ff9595-d7e5-4453-8bcd-095510d332d3\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") "
Mar 18 16:46:37.696353 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696310 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-oauth-serving-cert\") pod \"55ff9595-d7e5-4453-8bcd-095510d332d3\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") "
Mar 18 16:46:37.696353 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696329 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-serving-cert\") pod \"55ff9595-d7e5-4453-8bcd-095510d332d3\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") "
Mar 18 16:46:37.696608 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696370 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-oauth-config\") pod \"55ff9595-d7e5-4453-8bcd-095510d332d3\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") "
Mar 18 16:46:37.696608 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696395 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x57mp\" (UniqueName: \"kubernetes.io/projected/55ff9595-d7e5-4453-8bcd-095510d332d3-kube-api-access-x57mp\") pod \"55ff9595-d7e5-4453-8bcd-095510d332d3\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") "
Mar 18 16:46:37.696608 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696430 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-trusted-ca-bundle\") pod \"55ff9595-d7e5-4453-8bcd-095510d332d3\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") "
Mar 18 16:46:37.696608 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696460 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-service-ca\") pod \"55ff9595-d7e5-4453-8bcd-095510d332d3\" (UID: \"55ff9595-d7e5-4453-8bcd-095510d332d3\") "
Mar 18 16:46:37.696810 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696701 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "55ff9595-d7e5-4453-8bcd-095510d332d3" (UID: "55ff9595-d7e5-4453-8bcd-095510d332d3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:46:37.696927 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696902 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-console-config" (OuterVolumeSpecName: "console-config") pod "55ff9595-d7e5-4453-8bcd-095510d332d3" (UID: "55ff9595-d7e5-4453-8bcd-095510d332d3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:46:37.696927 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696914 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-service-ca" (OuterVolumeSpecName: "service-ca") pod "55ff9595-d7e5-4453-8bcd-095510d332d3" (UID: "55ff9595-d7e5-4453-8bcd-095510d332d3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:46:37.697056 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.696974 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "55ff9595-d7e5-4453-8bcd-095510d332d3" (UID: "55ff9595-d7e5-4453-8bcd-095510d332d3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:46:37.698584 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.698560 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "55ff9595-d7e5-4453-8bcd-095510d332d3" (UID: "55ff9595-d7e5-4453-8bcd-095510d332d3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:46:37.698996 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.698965 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "55ff9595-d7e5-4453-8bcd-095510d332d3" (UID: "55ff9595-d7e5-4453-8bcd-095510d332d3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:46:37.698996 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.698975 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ff9595-d7e5-4453-8bcd-095510d332d3-kube-api-access-x57mp" (OuterVolumeSpecName: "kube-api-access-x57mp") pod "55ff9595-d7e5-4453-8bcd-095510d332d3" (UID: "55ff9595-d7e5-4453-8bcd-095510d332d3"). InnerVolumeSpecName "kube-api-access-x57mp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:46:37.797615 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.797580 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-console-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:46:37.797615 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.797609 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-oauth-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:46:37.797615 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.797620 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:46:37.797829 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.797630 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55ff9595-d7e5-4453-8bcd-095510d332d3-console-oauth-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:46:37.797829 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.797639 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x57mp\" (UniqueName: \"kubernetes.io/projected/55ff9595-d7e5-4453-8bcd-095510d332d3-kube-api-access-x57mp\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:46:37.797829 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.797649 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-trusted-ca-bundle\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:46:37.797829 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:37.797657 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55ff9595-d7e5-4453-8bcd-095510d332d3-service-ca\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:46:38.318185 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:38.318158 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fc7687f76-452pq_55ff9595-d7e5-4453-8bcd-095510d332d3/console/0.log"
Mar 18 16:46:38.318380 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:38.318197 2576 generic.go:358] "Generic (PLEG): container finished" podID="55ff9595-d7e5-4453-8bcd-095510d332d3" containerID="7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98" exitCode=2
Mar 18 16:46:38.318380 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:38.318232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc7687f76-452pq" event={"ID":"55ff9595-d7e5-4453-8bcd-095510d332d3","Type":"ContainerDied","Data":"7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98"}
Mar 18 16:46:38.318380 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:38.318270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fc7687f76-452pq" event={"ID":"55ff9595-d7e5-4453-8bcd-095510d332d3","Type":"ContainerDied","Data":"eaba5850e06d9a48c030e7f2c1c6fbc2dd3127dad62eab38f94c183c859b0160"}
Mar 18 16:46:38.318380 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:38.318280 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fc7687f76-452pq"
Mar 18 16:46:38.318583 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:38.318288 2576 scope.go:117] "RemoveContainer" containerID="7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98"
Mar 18 16:46:38.326895 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:38.326877 2576 scope.go:117] "RemoveContainer" containerID="7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98"
Mar 18 16:46:38.327221 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:46:38.327202 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98\": container with ID starting with 7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98 not found: ID does not exist" containerID="7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98"
Mar 18 16:46:38.327267 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:38.327231 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98"} err="failed to get container status \"7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98\": rpc error: code = NotFound desc = could not find container \"7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98\": container with ID starting with 7eeaf9f0c9043180cf52b6c8b85671d063373b8ec97ce975e22cf5156ab84e98 not found: ID does not exist"
Mar 18 16:46:38.343980 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:38.343929 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fc7687f76-452pq"]
Mar 18 16:46:38.347635 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:38.347605 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5fc7687f76-452pq"]
Mar 18 16:46:39.562407 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:46:39.562375 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55ff9595-d7e5-4453-8bcd-095510d332d3" path="/var/lib/kubelet/pods/55ff9595-d7e5-4453-8bcd-095510d332d3/volumes"
Mar 18 16:47:14.591989 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.591933 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-857857b765-vm745"]
Mar 18 16:47:14.592450 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.592194 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55ff9595-d7e5-4453-8bcd-095510d332d3" containerName="console"
Mar 18 16:47:14.592450 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.592206 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ff9595-d7e5-4453-8bcd-095510d332d3" containerName="console"
Mar 18 16:47:14.592450 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.592254 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="55ff9595-d7e5-4453-8bcd-095510d332d3" containerName="console"
Mar 18 16:47:14.595175 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.595157 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.611010 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.610684 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-857857b765-vm745"]
Mar 18 16:47:14.660390 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.660349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-service-ca\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.660390 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.660394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-oauth-serving-cert\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.660643 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.660434 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-console-config\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.660643 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.660450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-trusted-ca-bundle\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.660643 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.660470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-serving-cert\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.660643 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.660493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-oauth-config\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.660643 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.660620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49dm\" (UniqueName: \"kubernetes.io/projected/db06968c-5245-4ae6-9ba5-2c367b19044b-kube-api-access-b49dm\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.761422 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.761379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-service-ca\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.761618 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.761429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-oauth-serving-cert\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.761618 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.761462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-console-config\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.761618 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.761485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-trusted-ca-bundle\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.761618 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.761518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-serving-cert\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.761618 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.761558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-oauth-config\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.761618 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.761584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b49dm\" (UniqueName: \"kubernetes.io/projected/db06968c-5245-4ae6-9ba5-2c367b19044b-kube-api-access-b49dm\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.762217 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.762188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-service-ca\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.762327 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.762267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-oauth-serving-cert\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.762371 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.762340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-console-config\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.762453 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.762432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-trusted-ca-bundle\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.764616 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.764585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-serving-cert\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.764707 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.764691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-oauth-config\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.769463 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.769435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49dm\" (UniqueName: \"kubernetes.io/projected/db06968c-5245-4ae6-9ba5-2c367b19044b-kube-api-access-b49dm\") pod \"console-857857b765-vm745\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:14.903907 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:14.903792 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:15.024775 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:15.024743 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-857857b765-vm745"]
Mar 18 16:47:15.027900 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:47:15.027873 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb06968c_5245_4ae6_9ba5_2c367b19044b.slice/crio-da7ca0443fcfde4892ed852860204b28776153b8b5759b1493c192665c3eb4d5 WatchSource:0}: Error finding container da7ca0443fcfde4892ed852860204b28776153b8b5759b1493c192665c3eb4d5: Status 404 returned error can't find the container with id da7ca0443fcfde4892ed852860204b28776153b8b5759b1493c192665c3eb4d5
Mar 18 16:47:15.415693 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:15.415652 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857857b765-vm745" event={"ID":"db06968c-5245-4ae6-9ba5-2c367b19044b","Type":"ContainerStarted","Data":"b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e"}
Mar 18 16:47:15.415693 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:15.415689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857857b765-vm745" event={"ID":"db06968c-5245-4ae6-9ba5-2c367b19044b","Type":"ContainerStarted","Data":"da7ca0443fcfde4892ed852860204b28776153b8b5759b1493c192665c3eb4d5"}
Mar 18 16:47:15.434479 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:15.434430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-857857b765-vm745" podStartSLOduration=1.434415371 podStartE2EDuration="1.434415371s" podCreationTimestamp="2026-03-18 16:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:47:15.433098757 +0000 UTC m=+226.482914184" watchObservedRunningTime="2026-03-18 16:47:15.434415371 +0000 UTC m=+226.484230797"
Mar 18 16:47:24.904558 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:24.904522 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:24.904558 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:24.904562 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:24.909523 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:24.909494 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:25.447975 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:25.447924 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-857857b765-vm745"
Mar 18 16:47:25.507533 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:25.507494 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b5f87f654-8m79q"]
Mar 18 16:47:50.526315 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.526227 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b5f87f654-8m79q" podUID="58e6ce32-c778-4728-8c51-bf2e4f5525d0" containerName="console" containerID="cri-o://addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e" gracePeriod=15
Mar 18 16:47:50.753479 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.753456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b5f87f654-8m79q_58e6ce32-c778-4728-8c51-bf2e4f5525d0/console/0.log"
Mar 18 16:47:50.753603 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.753519 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-6b5f87f654-8m79q" Mar 18 16:47:50.942311 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942222 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qvw2\" (UniqueName: \"kubernetes.io/projected/58e6ce32-c778-4728-8c51-bf2e4f5525d0-kube-api-access-5qvw2\") pod \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " Mar 18 16:47:50.942311 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942260 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-service-ca\") pod \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " Mar 18 16:47:50.942311 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942305 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-oauth-config\") pod \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " Mar 18 16:47:50.942590 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942332 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-config\") pod \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " Mar 18 16:47:50.942590 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942349 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-oauth-serving-cert\") pod \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " Mar 18 16:47:50.942590 
ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942379 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-serving-cert\") pod \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " Mar 18 16:47:50.942590 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942403 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-trusted-ca-bundle\") pod \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\" (UID: \"58e6ce32-c778-4728-8c51-bf2e4f5525d0\") " Mar 18 16:47:50.942850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942823 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-service-ca" (OuterVolumeSpecName: "service-ca") pod "58e6ce32-c778-4728-8c51-bf2e4f5525d0" (UID: "58e6ce32-c778-4728-8c51-bf2e4f5525d0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:47:50.942850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942832 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "58e6ce32-c778-4728-8c51-bf2e4f5525d0" (UID: "58e6ce32-c778-4728-8c51-bf2e4f5525d0"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:47:50.942986 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942856 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-config" (OuterVolumeSpecName: "console-config") pod "58e6ce32-c778-4728-8c51-bf2e4f5525d0" (UID: "58e6ce32-c778-4728-8c51-bf2e4f5525d0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:47:50.942986 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.942910 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "58e6ce32-c778-4728-8c51-bf2e4f5525d0" (UID: "58e6ce32-c778-4728-8c51-bf2e4f5525d0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:47:50.944556 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.944532 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e6ce32-c778-4728-8c51-bf2e4f5525d0-kube-api-access-5qvw2" (OuterVolumeSpecName: "kube-api-access-5qvw2") pod "58e6ce32-c778-4728-8c51-bf2e4f5525d0" (UID: "58e6ce32-c778-4728-8c51-bf2e4f5525d0"). InnerVolumeSpecName "kube-api-access-5qvw2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:47:50.944653 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.944583 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "58e6ce32-c778-4728-8c51-bf2e4f5525d0" (UID: "58e6ce32-c778-4728-8c51-bf2e4f5525d0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:47:50.944653 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:50.944602 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "58e6ce32-c778-4728-8c51-bf2e4f5525d0" (UID: "58e6ce32-c778-4728-8c51-bf2e4f5525d0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:47:51.043599 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.043559 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5qvw2\" (UniqueName: \"kubernetes.io/projected/58e6ce32-c778-4728-8c51-bf2e4f5525d0-kube-api-access-5qvw2\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:47:51.043599 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.043594 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-service-ca\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:47:51.043599 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.043604 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-oauth-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:47:51.043599 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.043613 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:47:51.043864 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.043622 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-oauth-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:47:51.043864 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.043631 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/58e6ce32-c778-4728-8c51-bf2e4f5525d0-console-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:47:51.043864 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.043639 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58e6ce32-c778-4728-8c51-bf2e4f5525d0-trusted-ca-bundle\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:47:51.512685 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.512658 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b5f87f654-8m79q_58e6ce32-c778-4728-8c51-bf2e4f5525d0/console/0.log" Mar 18 16:47:51.512850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.512704 2576 generic.go:358] "Generic (PLEG): container finished" podID="58e6ce32-c778-4728-8c51-bf2e4f5525d0" containerID="addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e" exitCode=2 Mar 18 16:47:51.512850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.512780 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5f87f654-8m79q" event={"ID":"58e6ce32-c778-4728-8c51-bf2e4f5525d0","Type":"ContainerDied","Data":"addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e"} Mar 18 16:47:51.512850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.512788 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b5f87f654-8m79q" Mar 18 16:47:51.512850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.512814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b5f87f654-8m79q" event={"ID":"58e6ce32-c778-4728-8c51-bf2e4f5525d0","Type":"ContainerDied","Data":"931531dd1475ed8343aa39dbca9aa572e851abf2b4fea6b34004ba4c301dc718"} Mar 18 16:47:51.512850 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.512835 2576 scope.go:117] "RemoveContainer" containerID="addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e" Mar 18 16:47:51.520986 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.520968 2576 scope.go:117] "RemoveContainer" containerID="addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e" Mar 18 16:47:51.521259 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:47:51.521239 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e\": container with ID starting with addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e not found: ID does not exist" containerID="addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e" Mar 18 16:47:51.521333 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.521272 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e"} err="failed to get container status \"addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e\": rpc error: code = NotFound desc = could not find container \"addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e\": container with ID starting with addab9183a59b18c2eef543b9a03cbe0d36c9b85669aabb035e9091fad36122e not found: ID does not exist" Mar 18 16:47:51.532446 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.532421 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b5f87f654-8m79q"] Mar 18 16:47:51.536398 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.536374 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b5f87f654-8m79q"] Mar 18 16:47:51.565028 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:47:51.564996 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e6ce32-c778-4728-8c51-bf2e4f5525d0" path="/var/lib/kubelet/pods/58e6ce32-c778-4728-8c51-bf2e4f5525d0/volumes" Mar 18 16:48:20.155831 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.155789 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9"] Mar 18 16:48:20.156227 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.156100 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58e6ce32-c778-4728-8c51-bf2e4f5525d0" containerName="console" Mar 18 16:48:20.156227 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.156112 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e6ce32-c778-4728-8c51-bf2e4f5525d0" containerName="console" Mar 18 16:48:20.156227 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.156160 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="58e6ce32-c778-4728-8c51-bf2e4f5525d0" containerName="console" Mar 18 16:48:20.159072 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.159052 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.160920 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.160899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Mar 18 16:48:20.161087 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.160899 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rr5n6\"" Mar 18 16:48:20.161321 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.161305 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Mar 18 16:48:20.169662 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.169636 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9"] Mar 18 16:48:20.232219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.232183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.232408 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.232230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfn2r\" (UniqueName: \"kubernetes.io/projected/094df2f1-a012-48cf-9d21-121536972bae-kube-api-access-sfn2r\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.232408 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.232260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.333049 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.333020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.333163 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.333079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.333163 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.333118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfn2r\" (UniqueName: \"kubernetes.io/projected/094df2f1-a012-48cf-9d21-121536972bae-kube-api-access-sfn2r\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.333486 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.333461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.333556 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.333509 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.341274 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.341249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfn2r\" (UniqueName: \"kubernetes.io/projected/094df2f1-a012-48cf-9d21-121536972bae-kube-api-access-sfn2r\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.467573 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.467531 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:20.589495 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:20.589454 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9"] Mar 18 16:48:20.598143 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:48:20.598110 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod094df2f1_a012_48cf_9d21_121536972bae.slice/crio-9b11311d8b9bbfdf900e2589ac57c3235b46f99d5103d9c3bfc34b1ba6b1ec3d WatchSource:0}: Error finding container 9b11311d8b9bbfdf900e2589ac57c3235b46f99d5103d9c3bfc34b1ba6b1ec3d: Status 404 returned error can't find the container with id 9b11311d8b9bbfdf900e2589ac57c3235b46f99d5103d9c3bfc34b1ba6b1ec3d Mar 18 16:48:21.589048 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:21.589008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" event={"ID":"094df2f1-a012-48cf-9d21-121536972bae","Type":"ContainerStarted","Data":"9b11311d8b9bbfdf900e2589ac57c3235b46f99d5103d9c3bfc34b1ba6b1ec3d"} Mar 18 16:48:28.609409 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:28.609375 2576 generic.go:358] "Generic (PLEG): container finished" podID="094df2f1-a012-48cf-9d21-121536972bae" containerID="ecfdc54242972ecd15dd46a29c4b179544485c106d66b84e57d137596c303668" exitCode=0 Mar 18 16:48:28.609781 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:28.609421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" event={"ID":"094df2f1-a012-48cf-9d21-121536972bae","Type":"ContainerDied","Data":"ecfdc54242972ecd15dd46a29c4b179544485c106d66b84e57d137596c303668"} Mar 18 16:48:29.430767 ip-10-0-133-190 kubenswrapper[2576]: 
I0318 16:48:29.430723 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 16:48:29.431362 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:29.431272 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 16:48:29.444931 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:29.444900 2576 kubelet.go:1628] "Image garbage collection succeeded" Mar 18 16:48:31.618542 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:31.618503 2576 generic.go:358] "Generic (PLEG): container finished" podID="094df2f1-a012-48cf-9d21-121536972bae" containerID="d1c1b5c8446422304f889d0c27a20ec0c65d17d849839ba6bd2f493f8e235c06" exitCode=0 Mar 18 16:48:31.618999 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:31.618546 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" event={"ID":"094df2f1-a012-48cf-9d21-121536972bae","Type":"ContainerDied","Data":"d1c1b5c8446422304f889d0c27a20ec0c65d17d849839ba6bd2f493f8e235c06"} Mar 18 16:48:31.619566 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:31.619548 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:48:37.639970 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:37.639915 2576 generic.go:358] "Generic (PLEG): container finished" podID="094df2f1-a012-48cf-9d21-121536972bae" containerID="e41a913cd2a77902baaf482751e643c1365329afeedff57fe17637cd3b8e1a9b" exitCode=0 Mar 18 16:48:37.640402 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:37.640006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" 
event={"ID":"094df2f1-a012-48cf-9d21-121536972bae","Type":"ContainerDied","Data":"e41a913cd2a77902baaf482751e643c1365329afeedff57fe17637cd3b8e1a9b"} Mar 18 16:48:38.758151 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:38.758128 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:38.782455 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:38.782427 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfn2r\" (UniqueName: \"kubernetes.io/projected/094df2f1-a012-48cf-9d21-121536972bae-kube-api-access-sfn2r\") pod \"094df2f1-a012-48cf-9d21-121536972bae\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " Mar 18 16:48:38.782631 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:38.782467 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-bundle\") pod \"094df2f1-a012-48cf-9d21-121536972bae\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " Mar 18 16:48:38.782631 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:38.782502 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-util\") pod \"094df2f1-a012-48cf-9d21-121536972bae\" (UID: \"094df2f1-a012-48cf-9d21-121536972bae\") " Mar 18 16:48:38.783098 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:38.783064 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-bundle" (OuterVolumeSpecName: "bundle") pod "094df2f1-a012-48cf-9d21-121536972bae" (UID: "094df2f1-a012-48cf-9d21-121536972bae"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:48:38.784771 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:38.784735 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/094df2f1-a012-48cf-9d21-121536972bae-kube-api-access-sfn2r" (OuterVolumeSpecName: "kube-api-access-sfn2r") pod "094df2f1-a012-48cf-9d21-121536972bae" (UID: "094df2f1-a012-48cf-9d21-121536972bae"). InnerVolumeSpecName "kube-api-access-sfn2r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:38.788036 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:38.788010 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-util" (OuterVolumeSpecName: "util") pod "094df2f1-a012-48cf-9d21-121536972bae" (UID: "094df2f1-a012-48cf-9d21-121536972bae"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:48:38.883432 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:38.883393 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sfn2r\" (UniqueName: \"kubernetes.io/projected/094df2f1-a012-48cf-9d21-121536972bae-kube-api-access-sfn2r\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:48:38.883432 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:38.883425 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-bundle\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:48:38.883432 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:38.883436 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/094df2f1-a012-48cf-9d21-121536972bae-util\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:48:39.651027 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:39.650988 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" event={"ID":"094df2f1-a012-48cf-9d21-121536972bae","Type":"ContainerDied","Data":"9b11311d8b9bbfdf900e2589ac57c3235b46f99d5103d9c3bfc34b1ba6b1ec3d"} Mar 18 16:48:39.651027 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:39.651026 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b11311d8b9bbfdf900e2589ac57c3235b46f99d5103d9c3bfc34b1ba6b1ec3d" Mar 18 16:48:39.651227 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:39.651046 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwrfc9" Mar 18 16:48:41.745661 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.745621 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2"] Mar 18 16:48:41.746082 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.745924 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="094df2f1-a012-48cf-9d21-121536972bae" containerName="util" Mar 18 16:48:41.746082 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.745952 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="094df2f1-a012-48cf-9d21-121536972bae" containerName="util" Mar 18 16:48:41.746082 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.745965 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="094df2f1-a012-48cf-9d21-121536972bae" containerName="pull" Mar 18 16:48:41.746082 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.745971 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="094df2f1-a012-48cf-9d21-121536972bae" containerName="pull" Mar 18 16:48:41.746082 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.745985 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="094df2f1-a012-48cf-9d21-121536972bae" containerName="extract" Mar 18 16:48:41.746082 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.745991 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="094df2f1-a012-48cf-9d21-121536972bae" containerName="extract" Mar 18 16:48:41.746082 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.746034 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="094df2f1-a012-48cf-9d21-121536972bae" containerName="extract" Mar 18 16:48:41.748607 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.748590 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" Mar 18 16:48:41.750607 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.750581 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Mar 18 16:48:41.750765 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.750645 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-s6bxw\"" Mar 18 16:48:41.750765 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.750703 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Mar 18 16:48:41.750927 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.750911 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Mar 18 16:48:41.761281 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.761258 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2"] Mar 18 16:48:41.808683 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.808651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sznf6\" (UniqueName: 
\"kubernetes.io/projected/04b92d10-a9c6-49b7-8363-dacc7f5cff27-kube-api-access-sznf6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2\" (UID: \"04b92d10-a9c6-49b7-8363-dacc7f5cff27\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" Mar 18 16:48:41.808871 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.808704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/04b92d10-a9c6-49b7-8363-dacc7f5cff27-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2\" (UID: \"04b92d10-a9c6-49b7-8363-dacc7f5cff27\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" Mar 18 16:48:41.909974 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.909919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sznf6\" (UniqueName: \"kubernetes.io/projected/04b92d10-a9c6-49b7-8363-dacc7f5cff27-kube-api-access-sznf6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2\" (UID: \"04b92d10-a9c6-49b7-8363-dacc7f5cff27\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" Mar 18 16:48:41.910092 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.909997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/04b92d10-a9c6-49b7-8363-dacc7f5cff27-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2\" (UID: \"04b92d10-a9c6-49b7-8363-dacc7f5cff27\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" Mar 18 16:48:41.912422 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.912400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/04b92d10-a9c6-49b7-8363-dacc7f5cff27-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2\" (UID: 
\"04b92d10-a9c6-49b7-8363-dacc7f5cff27\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" Mar 18 16:48:41.919833 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:41.919808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sznf6\" (UniqueName: \"kubernetes.io/projected/04b92d10-a9c6-49b7-8363-dacc7f5cff27-kube-api-access-sznf6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2\" (UID: \"04b92d10-a9c6-49b7-8363-dacc7f5cff27\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" Mar 18 16:48:42.057795 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:42.057711 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" Mar 18 16:48:42.185727 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:42.185694 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2"] Mar 18 16:48:42.190330 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:48:42.190299 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04b92d10_a9c6_49b7_8363_dacc7f5cff27.slice/crio-7e2ef3c0dfc6eeed78c91a29623be5ea141ddc43c06e64dc5890b4e5206f891f WatchSource:0}: Error finding container 7e2ef3c0dfc6eeed78c91a29623be5ea141ddc43c06e64dc5890b4e5206f891f: Status 404 returned error can't find the container with id 7e2ef3c0dfc6eeed78c91a29623be5ea141ddc43c06e64dc5890b4e5206f891f Mar 18 16:48:42.659829 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:42.659790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" event={"ID":"04b92d10-a9c6-49b7-8363-dacc7f5cff27","Type":"ContainerStarted","Data":"7e2ef3c0dfc6eeed78c91a29623be5ea141ddc43c06e64dc5890b4e5206f891f"} Mar 18 16:48:49.151151 ip-10-0-133-190 kubenswrapper[2576]: 
I0318 16:48:49.151117 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-k9tkc"] Mar 18 16:48:49.165231 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.165203 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:49.165465 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.165441 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-k9tkc"] Mar 18 16:48:49.167100 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.167076 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Mar 18 16:48:49.167217 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.167076 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Mar 18 16:48:49.167217 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.167088 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-6vrgk\"" Mar 18 16:48:49.271802 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.271768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c9rf\" (UniqueName: \"kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-kube-api-access-2c9rf\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:49.271987 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.271842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/140481c5-d7d9-4ae0-9069-ce9555554fea-cabundle0\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " 
pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:49.271987 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.271913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:49.372347 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.372312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c9rf\" (UniqueName: \"kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-kube-api-access-2c9rf\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:49.372553 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.372365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/140481c5-d7d9-4ae0-9069-ce9555554fea-cabundle0\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:49.372553 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.372487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:49.372672 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.372608 2576 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:48:49.372672 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.372624 2576 projected.go:277] Couldn't get secret 
payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:48:49.372672 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.372634 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-k9tkc: references non-existent secret key: ca.crt Mar 18 16:48:49.372813 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.372697 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates podName:140481c5-d7d9-4ae0-9069-ce9555554fea nodeName:}" failed. No retries permitted until 2026-03-18 16:48:49.872678861 +0000 UTC m=+320.922494267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates") pod "keda-operator-ffbb595cb-k9tkc" (UID: "140481c5-d7d9-4ae0-9069-ce9555554fea") : references non-existent secret key: ca.crt Mar 18 16:48:49.373158 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.373142 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/140481c5-d7d9-4ae0-9069-ce9555554fea-cabundle0\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:49.382447 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.382419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9rf\" (UniqueName: \"kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-kube-api-access-2c9rf\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:49.437264 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.437188 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf"] Mar 18 16:48:49.456405 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.456369 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf"] Mar 18 16:48:49.456551 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.456510 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:49.458375 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.458349 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Mar 18 16:48:49.573672 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.573636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:49.573857 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.573683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stphx\" (UniqueName: \"kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-kube-api-access-stphx\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:49.573857 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.573782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/aca5b7ba-246a-44d8-9c40-655184405a3d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:49.674969 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.674914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:49.675164 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.674983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stphx\" (UniqueName: \"kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-kube-api-access-stphx\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:49.675164 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.675097 2576 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:48:49.675164 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.675124 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:48:49.675164 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.675146 2576 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Mar 18 16:48:49.675164 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.675153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/aca5b7ba-246a-44d8-9c40-655184405a3d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:49.675401 ip-10-0-133-190 kubenswrapper[2576]: 
E0318 16:48:49.675171 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Mar 18 16:48:49.675401 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.675255 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates podName:aca5b7ba-246a-44d8-9c40-655184405a3d nodeName:}" failed. No retries permitted until 2026-03-18 16:48:50.175236038 +0000 UTC m=+321.225051460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates") pod "keda-metrics-apiserver-7c9f485588-zm7rf" (UID: "aca5b7ba-246a-44d8-9c40-655184405a3d") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Mar 18 16:48:49.675609 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.675567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/aca5b7ba-246a-44d8-9c40-655184405a3d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:49.682751 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.682710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" event={"ID":"04b92d10-a9c6-49b7-8363-dacc7f5cff27","Type":"ContainerStarted","Data":"132a21831894485221848ed24178f3cd023c5db22cac974a22b54bc2b5530bbe"} Mar 18 16:48:49.683000 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.682981 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" Mar 18 16:48:49.685902 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.685873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stphx\" (UniqueName: \"kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-kube-api-access-stphx\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:49.702690 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.702596 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2" podStartSLOduration=2.252931541 podStartE2EDuration="8.702576954s" podCreationTimestamp="2026-03-18 16:48:41 +0000 UTC" firstStartedPulling="2026-03-18 16:48:42.192233798 +0000 UTC m=+313.242049219" lastFinishedPulling="2026-03-18 16:48:48.641879227 +0000 UTC m=+319.691694632" observedRunningTime="2026-03-18 16:48:49.701076587 +0000 UTC m=+320.750892015" watchObservedRunningTime="2026-03-18 16:48:49.702576954 +0000 UTC m=+320.752392382" Mar 18 16:48:49.725961 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.725915 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-89rv6"] Mar 18 16:48:49.748814 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.748785 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-89rv6"] Mar 18 16:48:49.748985 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.748918 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-89rv6" Mar 18 16:48:49.750598 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.750574 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Mar 18 16:48:49.876924 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.876887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8a6fac06-d564-40cf-ac8a-2cc567c7e70b-certificates\") pod \"keda-admission-cf49989db-89rv6\" (UID: \"8a6fac06-d564-40cf-ac8a-2cc567c7e70b\") " pod="openshift-keda/keda-admission-cf49989db-89rv6" Mar 18 16:48:49.877124 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.877046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn5tz\" (UniqueName: \"kubernetes.io/projected/8a6fac06-d564-40cf-ac8a-2cc567c7e70b-kube-api-access-cn5tz\") pod \"keda-admission-cf49989db-89rv6\" (UID: \"8a6fac06-d564-40cf-ac8a-2cc567c7e70b\") " pod="openshift-keda/keda-admission-cf49989db-89rv6" Mar 18 16:48:49.877124 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.877073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:49.877227 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.877177 2576 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:48:49.877227 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.877191 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:48:49.877227 ip-10-0-133-190 kubenswrapper[2576]: 
E0318 16:48:49.877201 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-k9tkc: references non-existent secret key: ca.crt Mar 18 16:48:49.877366 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.877249 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates podName:140481c5-d7d9-4ae0-9069-ce9555554fea nodeName:}" failed. No retries permitted until 2026-03-18 16:48:50.877232731 +0000 UTC m=+321.927048150 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates") pod "keda-operator-ffbb595cb-k9tkc" (UID: "140481c5-d7d9-4ae0-9069-ce9555554fea") : references non-existent secret key: ca.crt Mar 18 16:48:49.978156 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.978111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn5tz\" (UniqueName: \"kubernetes.io/projected/8a6fac06-d564-40cf-ac8a-2cc567c7e70b-kube-api-access-cn5tz\") pod \"keda-admission-cf49989db-89rv6\" (UID: \"8a6fac06-d564-40cf-ac8a-2cc567c7e70b\") " pod="openshift-keda/keda-admission-cf49989db-89rv6" Mar 18 16:48:49.978358 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.978182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8a6fac06-d564-40cf-ac8a-2cc567c7e70b-certificates\") pod \"keda-admission-cf49989db-89rv6\" (UID: \"8a6fac06-d564-40cf-ac8a-2cc567c7e70b\") " pod="openshift-keda/keda-admission-cf49989db-89rv6" Mar 18 16:48:49.978428 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.978355 2576 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Mar 18 16:48:49.978428 ip-10-0-133-190 kubenswrapper[2576]: E0318 
16:48:49.978380 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-89rv6: secret "keda-admission-webhooks-certs" not found Mar 18 16:48:49.978535 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:49.978432 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a6fac06-d564-40cf-ac8a-2cc567c7e70b-certificates podName:8a6fac06-d564-40cf-ac8a-2cc567c7e70b nodeName:}" failed. No retries permitted until 2026-03-18 16:48:50.478414715 +0000 UTC m=+321.528230125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8a6fac06-d564-40cf-ac8a-2cc567c7e70b-certificates") pod "keda-admission-cf49989db-89rv6" (UID: "8a6fac06-d564-40cf-ac8a-2cc567c7e70b") : secret "keda-admission-webhooks-certs" not found Mar 18 16:48:49.987424 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:49.987393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn5tz\" (UniqueName: \"kubernetes.io/projected/8a6fac06-d564-40cf-ac8a-2cc567c7e70b-kube-api-access-cn5tz\") pod \"keda-admission-cf49989db-89rv6\" (UID: \"8a6fac06-d564-40cf-ac8a-2cc567c7e70b\") " pod="openshift-keda/keda-admission-cf49989db-89rv6" Mar 18 16:48:50.180107 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:50.180056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:50.180557 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:50.180215 2576 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:48:50.180557 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:50.180241 2576 projected.go:277] Couldn't get 
secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:48:50.180557 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:50.180260 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf: references non-existent secret key: tls.crt Mar 18 16:48:50.180557 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:50.180320 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates podName:aca5b7ba-246a-44d8-9c40-655184405a3d nodeName:}" failed. No retries permitted until 2026-03-18 16:48:51.180301614 +0000 UTC m=+322.230117251 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates") pod "keda-metrics-apiserver-7c9f485588-zm7rf" (UID: "aca5b7ba-246a-44d8-9c40-655184405a3d") : references non-existent secret key: tls.crt Mar 18 16:48:50.483119 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:50.483080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8a6fac06-d564-40cf-ac8a-2cc567c7e70b-certificates\") pod \"keda-admission-cf49989db-89rv6\" (UID: \"8a6fac06-d564-40cf-ac8a-2cc567c7e70b\") " pod="openshift-keda/keda-admission-cf49989db-89rv6" Mar 18 16:48:50.485511 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:50.485489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8a6fac06-d564-40cf-ac8a-2cc567c7e70b-certificates\") pod \"keda-admission-cf49989db-89rv6\" (UID: \"8a6fac06-d564-40cf-ac8a-2cc567c7e70b\") " pod="openshift-keda/keda-admission-cf49989db-89rv6" Mar 18 16:48:50.660790 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:50.660747 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-89rv6" Mar 18 16:48:50.787258 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:50.787087 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-89rv6"] Mar 18 16:48:50.789904 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:48:50.789872 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a6fac06_d564_40cf_ac8a_2cc567c7e70b.slice/crio-66b497d936595d653c36ea9752d1bba6cd11cdf01e9dfd7bc0491cbbcafaef67 WatchSource:0}: Error finding container 66b497d936595d653c36ea9752d1bba6cd11cdf01e9dfd7bc0491cbbcafaef67: Status 404 returned error can't find the container with id 66b497d936595d653c36ea9752d1bba6cd11cdf01e9dfd7bc0491cbbcafaef67 Mar 18 16:48:50.885387 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:50.885348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" Mar 18 16:48:50.885549 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:50.885529 2576 secret.go:281] references non-existent secret key: ca.crt Mar 18 16:48:50.885586 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:50.885553 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Mar 18 16:48:50.885586 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:50.885566 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-k9tkc: references non-existent secret key: ca.crt Mar 18 16:48:50.885653 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:50.885635 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates podName:140481c5-d7d9-4ae0-9069-ce9555554fea nodeName:}" failed. No retries permitted until 2026-03-18 16:48:52.885614733 +0000 UTC m=+323.935430156 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates") pod "keda-operator-ffbb595cb-k9tkc" (UID: "140481c5-d7d9-4ae0-9069-ce9555554fea") : references non-existent secret key: ca.crt Mar 18 16:48:51.188222 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:51.188128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" Mar 18 16:48:51.188576 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:51.188281 2576 secret.go:281] references non-existent secret key: tls.crt Mar 18 16:48:51.188576 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:51.188302 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Mar 18 16:48:51.188576 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:51.188322 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf: references non-existent secret key: tls.crt Mar 18 16:48:51.188576 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:51.188375 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates podName:aca5b7ba-246a-44d8-9c40-655184405a3d nodeName:}" failed. No retries permitted until 2026-03-18 16:48:53.188360728 +0000 UTC m=+324.238176136 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates") pod "keda-metrics-apiserver-7c9f485588-zm7rf" (UID: "aca5b7ba-246a-44d8-9c40-655184405a3d") : references non-existent secret key: tls.crt
Mar 18 16:48:51.688784 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:51.688748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-89rv6" event={"ID":"8a6fac06-d564-40cf-ac8a-2cc567c7e70b","Type":"ContainerStarted","Data":"66b497d936595d653c36ea9752d1bba6cd11cdf01e9dfd7bc0491cbbcafaef67"}
Mar 18 16:48:52.903802 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:52.903763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc"
Mar 18 16:48:52.904254 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:52.903968 2576 secret.go:281] references non-existent secret key: ca.crt
Mar 18 16:48:52.904254 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:52.903992 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Mar 18 16:48:52.904254 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:52.904007 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-k9tkc: references non-existent secret key: ca.crt
Mar 18 16:48:52.904254 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:52.904090 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates podName:140481c5-d7d9-4ae0-9069-ce9555554fea nodeName:}" failed. No retries permitted until 2026-03-18 16:48:56.904068948 +0000 UTC m=+327.953884385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates") pod "keda-operator-ffbb595cb-k9tkc" (UID: "140481c5-d7d9-4ae0-9069-ce9555554fea") : references non-existent secret key: ca.crt
Mar 18 16:48:53.207464 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:53.207368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf"
Mar 18 16:48:53.207632 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:53.207521 2576 secret.go:281] references non-existent secret key: tls.crt
Mar 18 16:48:53.207632 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:53.207542 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Mar 18 16:48:53.207632 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:53.207560 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf: references non-existent secret key: tls.crt
Mar 18 16:48:53.207632 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:48:53.207614 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates podName:aca5b7ba-246a-44d8-9c40-655184405a3d nodeName:}" failed. No retries permitted until 2026-03-18 16:48:57.207599801 +0000 UTC m=+328.257415211 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates") pod "keda-metrics-apiserver-7c9f485588-zm7rf" (UID: "aca5b7ba-246a-44d8-9c40-655184405a3d") : references non-existent secret key: tls.crt
Mar 18 16:48:53.695826 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:53.695793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-89rv6" event={"ID":"8a6fac06-d564-40cf-ac8a-2cc567c7e70b","Type":"ContainerStarted","Data":"77831388a76f1c13328ccf5bbc1af0613456dd0c79a7c06de7e3d87a3e9d2b7e"}
Mar 18 16:48:53.696023 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:53.695910 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-89rv6"
Mar 18 16:48:53.711715 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:53.711669 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-89rv6" podStartSLOduration=2.289741912 podStartE2EDuration="4.711653364s" podCreationTimestamp="2026-03-18 16:48:49 +0000 UTC" firstStartedPulling="2026-03-18 16:48:50.791267753 +0000 UTC m=+321.841083173" lastFinishedPulling="2026-03-18 16:48:53.213179218 +0000 UTC m=+324.262994625" observedRunningTime="2026-03-18 16:48:53.710354935 +0000 UTC m=+324.760170361" watchObservedRunningTime="2026-03-18 16:48:53.711653364 +0000 UTC m=+324.761468791"
Mar 18 16:48:56.939784 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:56.939728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc"
Mar 18 16:48:56.942242 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:56.942223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/140481c5-d7d9-4ae0-9069-ce9555554fea-certificates\") pod \"keda-operator-ffbb595cb-k9tkc\" (UID: \"140481c5-d7d9-4ae0-9069-ce9555554fea\") " pod="openshift-keda/keda-operator-ffbb595cb-k9tkc"
Mar 18 16:48:56.976290 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:56.976257 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-k9tkc"
Mar 18 16:48:57.097909 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:57.097882 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-k9tkc"]
Mar 18 16:48:57.100355 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:48:57.100316 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod140481c5_d7d9_4ae0_9069_ce9555554fea.slice/crio-46519ceb1e10807b78f046bd3103cec54a6d67584fdf22ad2c40b87e5fb28e68 WatchSource:0}: Error finding container 46519ceb1e10807b78f046bd3103cec54a6d67584fdf22ad2c40b87e5fb28e68: Status 404 returned error can't find the container with id 46519ceb1e10807b78f046bd3103cec54a6d67584fdf22ad2c40b87e5fb28e68
Mar 18 16:48:57.243081 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:57.243044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf"
Mar 18 16:48:57.245813 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:57.245790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aca5b7ba-246a-44d8-9c40-655184405a3d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zm7rf\" (UID: \"aca5b7ba-246a-44d8-9c40-655184405a3d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf"
Mar 18 16:48:57.270194 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:57.270160 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf"
Mar 18 16:48:57.385504 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:57.385475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf"]
Mar 18 16:48:57.387603 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:48:57.387576 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca5b7ba_246a_44d8_9c40_655184405a3d.slice/crio-9e33a7fa430211dedd7a23b6a1ad43331f5aea796e66d7af6f6147d466db8d34 WatchSource:0}: Error finding container 9e33a7fa430211dedd7a23b6a1ad43331f5aea796e66d7af6f6147d466db8d34: Status 404 returned error can't find the container with id 9e33a7fa430211dedd7a23b6a1ad43331f5aea796e66d7af6f6147d466db8d34
Mar 18 16:48:57.707673 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:57.707566 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" event={"ID":"140481c5-d7d9-4ae0-9069-ce9555554fea","Type":"ContainerStarted","Data":"46519ceb1e10807b78f046bd3103cec54a6d67584fdf22ad2c40b87e5fb28e68"}
Mar 18 16:48:57.708433 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:48:57.708403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" event={"ID":"aca5b7ba-246a-44d8-9c40-655184405a3d","Type":"ContainerStarted","Data":"9e33a7fa430211dedd7a23b6a1ad43331f5aea796e66d7af6f6147d466db8d34"}
Mar 18 16:49:02.726504 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:02.726459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" event={"ID":"140481c5-d7d9-4ae0-9069-ce9555554fea","Type":"ContainerStarted","Data":"212ccee11f58dd4020f47a804c0c9f603966d1d5865f8d96579b509e63795431"}
Mar 18 16:49:02.726960 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:02.726578 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-k9tkc"
Mar 18 16:49:02.727885 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:02.727853 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" event={"ID":"aca5b7ba-246a-44d8-9c40-655184405a3d","Type":"ContainerStarted","Data":"a3caf27cc50d9196a09da8460742e165e3ed764ff7438a77f11d8b002c216a7e"}
Mar 18 16:49:02.728003 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:02.727990 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf"
Mar 18 16:49:02.743128 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:02.743073 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-k9tkc" podStartSLOduration=8.259084148 podStartE2EDuration="13.743054165s" podCreationTimestamp="2026-03-18 16:48:49 +0000 UTC" firstStartedPulling="2026-03-18 16:48:57.101626284 +0000 UTC m=+328.151441690" lastFinishedPulling="2026-03-18 16:49:02.58559629 +0000 UTC m=+333.635411707" observedRunningTime="2026-03-18 16:49:02.74256003 +0000 UTC m=+333.792375456" watchObservedRunningTime="2026-03-18 16:49:02.743054165 +0000 UTC m=+333.792869594"
Mar 18 16:49:02.789874 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:02.789813 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf" podStartSLOduration=8.597036316 podStartE2EDuration="13.789793231s" podCreationTimestamp="2026-03-18 16:48:49 +0000 UTC" firstStartedPulling="2026-03-18 16:48:57.388867011 +0000 UTC m=+328.438682417" lastFinishedPulling="2026-03-18 16:49:02.581623926 +0000 UTC m=+333.631439332" observedRunningTime="2026-03-18 16:49:02.788720402 +0000 UTC m=+333.838535828" watchObservedRunningTime="2026-03-18 16:49:02.789793231 +0000 UTC m=+333.839608658"
Mar 18 16:49:10.687772 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:10.687739 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-bxcb2"
Mar 18 16:49:13.735060 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:13.735030 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zm7rf"
Mar 18 16:49:14.701626 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:14.701597 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-89rv6"
Mar 18 16:49:23.733186 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:23.733097 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-k9tkc"
Mar 18 16:49:54.142387 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.142348 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"]
Mar 18 16:49:54.145694 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.145675 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:54.147767 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.147743 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Mar 18 16:49:54.148106 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.148085 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Mar 18 16:49:54.148214 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.148170 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-2zc9n\""
Mar 18 16:49:54.148214 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.148174 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Mar 18 16:49:54.154529 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.154507 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"]
Mar 18 16:49:54.157775 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.157755 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"]
Mar 18 16:49:54.157888 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.157874 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"
Mar 18 16:49:54.160143 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.160121 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Mar 18 16:49:54.160231 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.160124 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-lcjsd\""
Mar 18 16:49:54.168580 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.168558 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"]
Mar 18 16:49:54.282107 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.282061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3681644e-43e3-4920-8b89-827871283593-cert\") pod \"kserve-controller-manager-69d7c9bbdc-dnfmw\" (UID: \"3681644e-43e3-4920-8b89-827871283593\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:54.282290 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.282111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddjw9\" (UniqueName: \"kubernetes.io/projected/3681644e-43e3-4920-8b89-827871283593-kube-api-access-ddjw9\") pod \"kserve-controller-manager-69d7c9bbdc-dnfmw\" (UID: \"3681644e-43e3-4920-8b89-827871283593\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:54.282290 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.282219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pphjj\" (UniqueName: \"kubernetes.io/projected/a141314c-4a0e-4f87-b919-d9f540c7e434-kube-api-access-pphjj\") pod \"llmisvc-controller-manager-68cc5db7c4-x44z4\" (UID: \"a141314c-4a0e-4f87-b919-d9f540c7e434\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"
Mar 18 16:49:54.282290 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.282259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a141314c-4a0e-4f87-b919-d9f540c7e434-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x44z4\" (UID: \"a141314c-4a0e-4f87-b919-d9f540c7e434\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"
Mar 18 16:49:54.383428 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.383384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3681644e-43e3-4920-8b89-827871283593-cert\") pod \"kserve-controller-manager-69d7c9bbdc-dnfmw\" (UID: \"3681644e-43e3-4920-8b89-827871283593\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:54.383428 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.383432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddjw9\" (UniqueName: \"kubernetes.io/projected/3681644e-43e3-4920-8b89-827871283593-kube-api-access-ddjw9\") pod \"kserve-controller-manager-69d7c9bbdc-dnfmw\" (UID: \"3681644e-43e3-4920-8b89-827871283593\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:54.383674 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.383505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pphjj\" (UniqueName: \"kubernetes.io/projected/a141314c-4a0e-4f87-b919-d9f540c7e434-kube-api-access-pphjj\") pod \"llmisvc-controller-manager-68cc5db7c4-x44z4\" (UID: \"a141314c-4a0e-4f87-b919-d9f540c7e434\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"
Mar 18 16:49:54.383674 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.383537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a141314c-4a0e-4f87-b919-d9f540c7e434-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x44z4\" (UID: \"a141314c-4a0e-4f87-b919-d9f540c7e434\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"
Mar 18 16:49:54.385794 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.385762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3681644e-43e3-4920-8b89-827871283593-cert\") pod \"kserve-controller-manager-69d7c9bbdc-dnfmw\" (UID: \"3681644e-43e3-4920-8b89-827871283593\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:54.386007 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.385985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a141314c-4a0e-4f87-b919-d9f540c7e434-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-x44z4\" (UID: \"a141314c-4a0e-4f87-b919-d9f540c7e434\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"
Mar 18 16:49:54.391141 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.391120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddjw9\" (UniqueName: \"kubernetes.io/projected/3681644e-43e3-4920-8b89-827871283593-kube-api-access-ddjw9\") pod \"kserve-controller-manager-69d7c9bbdc-dnfmw\" (UID: \"3681644e-43e3-4920-8b89-827871283593\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:54.391463 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.391442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pphjj\" (UniqueName: \"kubernetes.io/projected/a141314c-4a0e-4f87-b919-d9f540c7e434-kube-api-access-pphjj\") pod \"llmisvc-controller-manager-68cc5db7c4-x44z4\" (UID: \"a141314c-4a0e-4f87-b919-d9f540c7e434\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"
Mar 18 16:49:54.458342 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.458247 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:54.471133 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.471090 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"
Mar 18 16:49:54.599413 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.599384 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"]
Mar 18 16:49:54.602047 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:49:54.602013 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3681644e_43e3_4920_8b89_827871283593.slice/crio-4bc861afac6ec18f4e657eee7052fc248b498d20a1ae6324b077710f9a5fc49d WatchSource:0}: Error finding container 4bc861afac6ec18f4e657eee7052fc248b498d20a1ae6324b077710f9a5fc49d: Status 404 returned error can't find the container with id 4bc861afac6ec18f4e657eee7052fc248b498d20a1ae6324b077710f9a5fc49d
Mar 18 16:49:54.626693 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.626670 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"]
Mar 18 16:49:54.629165 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:49:54.629139 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda141314c_4a0e_4f87_b919_d9f540c7e434.slice/crio-949d93f1144be5a797d6b13a032fa00742bed10383e60de1c785f70fc2c26318 WatchSource:0}: Error finding container 949d93f1144be5a797d6b13a032fa00742bed10383e60de1c785f70fc2c26318: Status 404 returned error can't find the container with id 949d93f1144be5a797d6b13a032fa00742bed10383e60de1c785f70fc2c26318
Mar 18 16:49:54.890545 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.890494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw" event={"ID":"3681644e-43e3-4920-8b89-827871283593","Type":"ContainerStarted","Data":"4bc861afac6ec18f4e657eee7052fc248b498d20a1ae6324b077710f9a5fc49d"}
Mar 18 16:49:54.891474 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:54.891435 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4" event={"ID":"a141314c-4a0e-4f87-b919-d9f540c7e434","Type":"ContainerStarted","Data":"949d93f1144be5a797d6b13a032fa00742bed10383e60de1c785f70fc2c26318"}
Mar 18 16:49:55.502117 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:55.502057 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"]
Mar 18 16:49:58.906969 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:58.906909 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw" event={"ID":"3681644e-43e3-4920-8b89-827871283593","Type":"ContainerStarted","Data":"f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434"}
Mar 18 16:49:58.907409 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:58.907043 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw" podUID="3681644e-43e3-4920-8b89-827871283593" containerName="manager" containerID="cri-o://f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434" gracePeriod=10
Mar 18 16:49:58.907409 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:58.907095 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:58.908228 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:58.908205 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4" event={"ID":"a141314c-4a0e-4f87-b919-d9f540c7e434","Type":"ContainerStarted","Data":"e3c2c7cb25eaf1f9de8e5ad4d98150b24f51ccfe238c8e9be9457a7b776f6752"}
Mar 18 16:49:58.908367 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:58.908355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"
Mar 18 16:49:58.925766 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:58.925722 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw" podStartSLOduration=1.469496878 podStartE2EDuration="4.925710523s" podCreationTimestamp="2026-03-18 16:49:54 +0000 UTC" firstStartedPulling="2026-03-18 16:49:54.603828724 +0000 UTC m=+385.653644129" lastFinishedPulling="2026-03-18 16:49:58.060042366 +0000 UTC m=+389.109857774" observedRunningTime="2026-03-18 16:49:58.924085677 +0000 UTC m=+389.973901105" watchObservedRunningTime="2026-03-18 16:49:58.925710523 +0000 UTC m=+389.975525950"
Mar 18 16:49:58.940107 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:58.940068 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4" podStartSLOduration=1.552225543 podStartE2EDuration="4.94005556s" podCreationTimestamp="2026-03-18 16:49:54 +0000 UTC" firstStartedPulling="2026-03-18 16:49:54.630416263 +0000 UTC m=+385.680231669" lastFinishedPulling="2026-03-18 16:49:58.018246281 +0000 UTC m=+389.068061686" observedRunningTime="2026-03-18 16:49:58.939685431 +0000 UTC m=+389.989500858" watchObservedRunningTime="2026-03-18 16:49:58.94005556 +0000 UTC m=+389.989870986"
Mar 18 16:49:59.244930 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.244906 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:59.324398 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.324363 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddjw9\" (UniqueName: \"kubernetes.io/projected/3681644e-43e3-4920-8b89-827871283593-kube-api-access-ddjw9\") pod \"3681644e-43e3-4920-8b89-827871283593\" (UID: \"3681644e-43e3-4920-8b89-827871283593\") "
Mar 18 16:49:59.324398 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.324406 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3681644e-43e3-4920-8b89-827871283593-cert\") pod \"3681644e-43e3-4920-8b89-827871283593\" (UID: \"3681644e-43e3-4920-8b89-827871283593\") "
Mar 18 16:49:59.326640 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.326603 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3681644e-43e3-4920-8b89-827871283593-kube-api-access-ddjw9" (OuterVolumeSpecName: "kube-api-access-ddjw9") pod "3681644e-43e3-4920-8b89-827871283593" (UID: "3681644e-43e3-4920-8b89-827871283593"). InnerVolumeSpecName "kube-api-access-ddjw9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:49:59.326758 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.326610 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3681644e-43e3-4920-8b89-827871283593-cert" (OuterVolumeSpecName: "cert") pod "3681644e-43e3-4920-8b89-827871283593" (UID: "3681644e-43e3-4920-8b89-827871283593"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:59.425628 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.425537 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddjw9\" (UniqueName: \"kubernetes.io/projected/3681644e-43e3-4920-8b89-827871283593-kube-api-access-ddjw9\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:49:59.425628 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.425572 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3681644e-43e3-4920-8b89-827871283593-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:49:59.912244 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.912208 2576 generic.go:358] "Generic (PLEG): container finished" podID="3681644e-43e3-4920-8b89-827871283593" containerID="f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434" exitCode=0
Mar 18 16:49:59.912655 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.912270 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"
Mar 18 16:49:59.912655 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.912297 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw" event={"ID":"3681644e-43e3-4920-8b89-827871283593","Type":"ContainerDied","Data":"f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434"}
Mar 18 16:49:59.912655 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.912335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-dnfmw" event={"ID":"3681644e-43e3-4920-8b89-827871283593","Type":"ContainerDied","Data":"4bc861afac6ec18f4e657eee7052fc248b498d20a1ae6324b077710f9a5fc49d"}
Mar 18 16:49:59.912655 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.912352 2576 scope.go:117] "RemoveContainer" containerID="f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434"
Mar 18 16:49:59.920401 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.920378 2576 scope.go:117] "RemoveContainer" containerID="f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434"
Mar 18 16:49:59.920657 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:49:59.920637 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434\": container with ID starting with f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434 not found: ID does not exist" containerID="f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434"
Mar 18 16:49:59.920720 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.920665 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434"} err="failed to get container status \"f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434\": rpc error: code = NotFound desc = could not find container \"f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434\": container with ID starting with f73e37c7d65d2dade321935d9cd82d6a1597800c425e958706b4f61ff253c434 not found: ID does not exist"
Mar 18 16:49:59.927659 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.927630 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"]
Mar 18 16:49:59.931523 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:49:59.931501 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-dnfmw"]
Mar 18 16:50:01.563297 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:01.563267 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3681644e-43e3-4920-8b89-827871283593" path="/var/lib/kubelet/pods/3681644e-43e3-4920-8b89-827871283593/volumes"
Mar 18 16:50:29.915363 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:29.915327 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-x44z4"
Mar 18 16:50:31.791479 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.791444 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-9699c8d45-jrcb2"]
Mar 18 16:50:31.791862 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.791735 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3681644e-43e3-4920-8b89-827871283593" containerName="manager"
Mar 18 16:50:31.791862 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.791746 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3681644e-43e3-4920-8b89-827871283593" containerName="manager"
Mar 18 16:50:31.791862 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.791795 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3681644e-43e3-4920-8b89-827871283593" containerName="manager"
Mar 18 16:50:31.838312 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.838270 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-vxtx7"]
Mar 18 16:50:31.838465 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.838422 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-9699c8d45-jrcb2"
Mar 18 16:50:31.840394 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.840356 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-424v9\""
Mar 18 16:50:31.840565 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.840413 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Mar 18 16:50:31.859486 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.859458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-9699c8d45-jrcb2"]
Mar 18 16:50:31.859486 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.859489 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-vxtx7"]
Mar 18 16:50:31.859645 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.859589 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-vxtx7"
Mar 18 16:50:31.861518 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.861498 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-smj6s\""
Mar 18 16:50:31.861625 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.861534 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Mar 18 16:50:31.880133 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.880109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6p25\" (UniqueName: \"kubernetes.io/projected/c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3-kube-api-access-k6p25\") pod \"model-serving-api-9699c8d45-jrcb2\" (UID: \"c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3\") " pod="kserve/model-serving-api-9699c8d45-jrcb2"
Mar 18 16:50:31.880262 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.880143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3-tls-certs\") pod \"model-serving-api-9699c8d45-jrcb2\" (UID: \"c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3\") " pod="kserve/model-serving-api-9699c8d45-jrcb2"
Mar 18 16:50:31.880262 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.880195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bd4203b-0848-472a-94ca-4d8cc6e6b1bf-cert\") pod \"odh-model-controller-696fc77849-vxtx7\" (UID: \"7bd4203b-0848-472a-94ca-4d8cc6e6b1bf\") " pod="kserve/odh-model-controller-696fc77849-vxtx7"
Mar 18 16:50:31.880262 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.880217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bb7p\" (UniqueName: \"kubernetes.io/projected/7bd4203b-0848-472a-94ca-4d8cc6e6b1bf-kube-api-access-8bb7p\") pod \"odh-model-controller-696fc77849-vxtx7\" (UID: \"7bd4203b-0848-472a-94ca-4d8cc6e6b1bf\") " pod="kserve/odh-model-controller-696fc77849-vxtx7"
Mar 18 16:50:31.981041 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.981008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6p25\" (UniqueName: \"kubernetes.io/projected/c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3-kube-api-access-k6p25\") pod \"model-serving-api-9699c8d45-jrcb2\" (UID: \"c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3\") " pod="kserve/model-serving-api-9699c8d45-jrcb2"
Mar 18 16:50:31.981041 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.981049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3-tls-certs\") pod \"model-serving-api-9699c8d45-jrcb2\" (UID: \"c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3\") " pod="kserve/model-serving-api-9699c8d45-jrcb2"
Mar 18 16:50:31.981259 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.981086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bd4203b-0848-472a-94ca-4d8cc6e6b1bf-cert\") pod \"odh-model-controller-696fc77849-vxtx7\" (UID: \"7bd4203b-0848-472a-94ca-4d8cc6e6b1bf\") " pod="kserve/odh-model-controller-696fc77849-vxtx7"
Mar 18 16:50:31.981259 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.981105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bb7p\" (UniqueName: \"kubernetes.io/projected/7bd4203b-0848-472a-94ca-4d8cc6e6b1bf-kube-api-access-8bb7p\") pod \"odh-model-controller-696fc77849-vxtx7\" (UID: \"7bd4203b-0848-472a-94ca-4d8cc6e6b1bf\") " pod="kserve/odh-model-controller-696fc77849-vxtx7"
Mar 18 16:50:31.981330 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:50:31.981292 2576 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Mar 18 16:50:31.981370 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:50:31.981355 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3-tls-certs podName:c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3 nodeName:}" failed. No retries permitted until 2026-03-18 16:50:32.481337891 +0000 UTC m=+423.531153304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3-tls-certs") pod "model-serving-api-9699c8d45-jrcb2" (UID: "c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3") : secret "model-serving-api-tls" not found
Mar 18 16:50:31.983685 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.983662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7bd4203b-0848-472a-94ca-4d8cc6e6b1bf-cert\") pod \"odh-model-controller-696fc77849-vxtx7\" (UID: \"7bd4203b-0848-472a-94ca-4d8cc6e6b1bf\") " pod="kserve/odh-model-controller-696fc77849-vxtx7"
Mar 18 16:50:31.989718 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.989692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bb7p\" (UniqueName: \"kubernetes.io/projected/7bd4203b-0848-472a-94ca-4d8cc6e6b1bf-kube-api-access-8bb7p\") pod \"odh-model-controller-696fc77849-vxtx7\" (UID: \"7bd4203b-0848-472a-94ca-4d8cc6e6b1bf\") " pod="kserve/odh-model-controller-696fc77849-vxtx7"
Mar 18 16:50:31.990357 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:31.990339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6p25\" (UniqueName: \"kubernetes.io/projected/c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3-kube-api-access-k6p25\") pod \"model-serving-api-9699c8d45-jrcb2\" (UID: \"c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3\") "
pod="kserve/model-serving-api-9699c8d45-jrcb2" Mar 18 16:50:32.169845 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:32.169751 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-vxtx7" Mar 18 16:50:32.299548 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:32.299526 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-vxtx7"] Mar 18 16:50:32.301572 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:50:32.301543 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd4203b_0848_472a_94ca_4d8cc6e6b1bf.slice/crio-a32353050ba6b9774065b4e2aa3cf2572b9a73d6bada6bb61a58acbcb57c11be WatchSource:0}: Error finding container a32353050ba6b9774065b4e2aa3cf2572b9a73d6bada6bb61a58acbcb57c11be: Status 404 returned error can't find the container with id a32353050ba6b9774065b4e2aa3cf2572b9a73d6bada6bb61a58acbcb57c11be Mar 18 16:50:32.485500 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:32.485472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3-tls-certs\") pod \"model-serving-api-9699c8d45-jrcb2\" (UID: \"c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3\") " pod="kserve/model-serving-api-9699c8d45-jrcb2" Mar 18 16:50:32.487844 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:32.487824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3-tls-certs\") pod \"model-serving-api-9699c8d45-jrcb2\" (UID: \"c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3\") " pod="kserve/model-serving-api-9699c8d45-jrcb2" Mar 18 16:50:32.748707 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:32.748613 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-9699c8d45-jrcb2" Mar 18 16:50:32.865959 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:32.865904 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-9699c8d45-jrcb2"] Mar 18 16:50:32.868738 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:50:32.868711 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1eb4bd1_d0f6_4dd7_a063_638cff8ab0d3.slice/crio-4bc424e97b747531771b7fb38e28ff94832f45d0c123ac0680820d25a08cebf7 WatchSource:0}: Error finding container 4bc424e97b747531771b7fb38e28ff94832f45d0c123ac0680820d25a08cebf7: Status 404 returned error can't find the container with id 4bc424e97b747531771b7fb38e28ff94832f45d0c123ac0680820d25a08cebf7 Mar 18 16:50:33.020797 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:33.020761 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-vxtx7" event={"ID":"7bd4203b-0848-472a-94ca-4d8cc6e6b1bf","Type":"ContainerStarted","Data":"a32353050ba6b9774065b4e2aa3cf2572b9a73d6bada6bb61a58acbcb57c11be"} Mar 18 16:50:33.021665 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:33.021642 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-9699c8d45-jrcb2" event={"ID":"c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3","Type":"ContainerStarted","Data":"4bc424e97b747531771b7fb38e28ff94832f45d0c123ac0680820d25a08cebf7"} Mar 18 16:50:33.133643 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:50:33.133563 2576 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast 
in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:50:33.133869 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:50:33.133824 2576 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6p25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-jrcb2_kserve(c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:50:33.135237 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:50:33.135201 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the 
requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:50:34.025666 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:50:34.025636 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:50:36.032877 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:36.032841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-vxtx7" event={"ID":"7bd4203b-0848-472a-94ca-4d8cc6e6b1bf","Type":"ContainerStarted","Data":"6150651092ec3b029227e4353157ac9a610fd5acd04d05fab0d455d74f38e8f0"} Mar 18 16:50:36.033298 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:36.032897 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-vxtx7" Mar 18 16:50:36.048750 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:36.048700 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-vxtx7" podStartSLOduration=1.6669676230000001 podStartE2EDuration="5.048685202s" podCreationTimestamp="2026-03-18 16:50:31 +0000 UTC" firstStartedPulling="2026-03-18 16:50:32.302876048 +0000 UTC m=+423.352691453" lastFinishedPulling="2026-03-18 16:50:35.684593624 +0000 
UTC m=+426.734409032" observedRunningTime="2026-03-18 16:50:36.047513199 +0000 UTC m=+427.097328626" watchObservedRunningTime="2026-03-18 16:50:36.048685202 +0000 UTC m=+427.098500677" Mar 18 16:50:44.846851 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:50:44.846749 2576 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:50:44.847241 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:50:44.846975 2576 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6p25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-jrcb2_kserve(c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image 
source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:50:44.848413 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:50:44.848387 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:50:47.037708 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:50:47.037677 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-vxtx7" Mar 18 16:50:55.560580 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:50:55.560535 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:51:06.206118 
ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.206081 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc"] Mar 18 16:51:06.208971 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.208933 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:51:06.210962 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.210920 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Mar 18 16:51:06.211453 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.211434 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fnxb\"" Mar 18 16:51:06.211558 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.211439 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Mar 18 16:51:06.217765 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.217741 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc"] Mar 18 16:51:06.362891 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.362846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8htds\" (UniqueName: \"kubernetes.io/projected/b0def082-c96a-4df0-9ef8-12519113bcdb-kube-api-access-8htds\") pod \"isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc\" (UID: \"b0def082-c96a-4df0-9ef8-12519113bcdb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:51:06.363088 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.362999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0def082-c96a-4df0-9ef8-12519113bcdb-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc\" (UID: \"b0def082-c96a-4df0-9ef8-12519113bcdb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:51:06.463481 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.463452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0def082-c96a-4df0-9ef8-12519113bcdb-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc\" (UID: \"b0def082-c96a-4df0-9ef8-12519113bcdb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:51:06.463762 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.463505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8htds\" (UniqueName: \"kubernetes.io/projected/b0def082-c96a-4df0-9ef8-12519113bcdb-kube-api-access-8htds\") pod \"isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc\" (UID: \"b0def082-c96a-4df0-9ef8-12519113bcdb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:51:06.463934 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.463911 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0def082-c96a-4df0-9ef8-12519113bcdb-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc\" (UID: \"b0def082-c96a-4df0-9ef8-12519113bcdb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:51:06.470893 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.470865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8htds\" 
(UniqueName: \"kubernetes.io/projected/b0def082-c96a-4df0-9ef8-12519113bcdb-kube-api-access-8htds\") pod \"isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc\" (UID: \"b0def082-c96a-4df0-9ef8-12519113bcdb\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:51:06.519933 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.519894 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:51:06.644429 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:06.644387 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc"] Mar 18 16:51:06.648883 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:51:06.648847 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0def082_c96a_4df0_9ef8_12519113bcdb.slice/crio-d22bb5b4c794d1508cf2098bc850f3e1e916571ebc68f007b9295867dfd7b0ff WatchSource:0}: Error finding container d22bb5b4c794d1508cf2098bc850f3e1e916571ebc68f007b9295867dfd7b0ff: Status 404 returned error can't find the container with id d22bb5b4c794d1508cf2098bc850f3e1e916571ebc68f007b9295867dfd7b0ff Mar 18 16:51:06.931525 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:51:06.931426 2576 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 
18 16:51:06.931681 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:51:06.931625 2576 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6p25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-jrcb2_kserve(c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:51:06.932797 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:51:06.932771 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the 
requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:51:07.130566 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:07.130530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" event={"ID":"b0def082-c96a-4df0-9ef8-12519113bcdb","Type":"ContainerStarted","Data":"d22bb5b4c794d1508cf2098bc850f3e1e916571ebc68f007b9295867dfd7b0ff"} Mar 18 16:51:11.147085 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:11.146995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" event={"ID":"b0def082-c96a-4df0-9ef8-12519113bcdb","Type":"ContainerStarted","Data":"901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0"} Mar 18 16:51:15.161062 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:15.160975 2576 generic.go:358] "Generic (PLEG): container finished" podID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerID="901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0" exitCode=0 Mar 18 16:51:15.161062 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:15.161048 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" event={"ID":"b0def082-c96a-4df0-9ef8-12519113bcdb","Type":"ContainerDied","Data":"901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0"} Mar 18 16:51:18.560216 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:51:18.560158 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:51:20.298399 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:20.298358 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-857857b765-vm745"] Mar 18 16:51:24.195074 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:24.195029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" event={"ID":"b0def082-c96a-4df0-9ef8-12519113bcdb","Type":"ContainerStarted","Data":"2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949"} Mar 18 16:51:26.203487 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:26.203454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" event={"ID":"b0def082-c96a-4df0-9ef8-12519113bcdb","Type":"ContainerStarted","Data":"4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1"} Mar 18 16:51:26.203894 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:26.203752 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:51:26.205194 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:26.205151 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:51:26.221663 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:51:26.221606 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podStartSLOduration=0.747938796 podStartE2EDuration="20.221588804s" podCreationTimestamp="2026-03-18 16:51:06 +0000 UTC" firstStartedPulling="2026-03-18 16:51:06.651136956 +0000 UTC m=+457.700952362" lastFinishedPulling="2026-03-18 16:51:26.124786962 +0000 UTC m=+477.174602370" observedRunningTime="2026-03-18 16:51:26.219956002 +0000 UTC m=+477.269771419" watchObservedRunningTime="2026-03-18 16:51:26.221588804 +0000 UTC m=+477.271404232" Mar 18 16:51:27.208895 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:27.208863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:51:27.209286 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:27.209004 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:51:27.210155 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:27.210125 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:51:28.211564 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:28.211524 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: 
connection refused" Mar 18 16:51:28.211984 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:28.211921 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:51:32.560454 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:51:32.560421 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:51:38.211864 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:38.211806 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:51:38.212366 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:38.212289 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:51:44.559458 
ip-10-0-133-190 kubenswrapper[2576]: E0318 16:51:44.559423 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:51:45.318452 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.318411 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-857857b765-vm745" podUID="db06968c-5245-4ae6-9ba5-2c367b19044b" containerName="console" containerID="cri-o://b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e" gracePeriod=15 Mar 18 16:51:45.444947 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.444900 2576 patch_prober.go:28] interesting pod/console-857857b765-vm745 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.19:8443/health\": dial tcp 10.133.0.19:8443: connect: connection refused" start-of-body= Mar 18 16:51:45.445094 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.445005 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-857857b765-vm745" podUID="db06968c-5245-4ae6-9ba5-2c367b19044b" containerName="console" probeResult="failure" output="Get \"https://10.133.0.19:8443/health\": dial tcp 10.133.0.19:8443: connect: connection refused" Mar 18 16:51:45.578340 ip-10-0-133-190 kubenswrapper[2576]: I0318 
16:51:45.578278 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-857857b765-vm745_db06968c-5245-4ae6-9ba5-2c367b19044b/console/0.log" Mar 18 16:51:45.578645 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.578345 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857857b765-vm745" Mar 18 16:51:45.671826 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.671794 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-serving-cert\") pod \"db06968c-5245-4ae6-9ba5-2c367b19044b\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " Mar 18 16:51:45.671826 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.671828 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-trusted-ca-bundle\") pod \"db06968c-5245-4ae6-9ba5-2c367b19044b\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " Mar 18 16:51:45.672059 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.671851 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-console-config\") pod \"db06968c-5245-4ae6-9ba5-2c367b19044b\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " Mar 18 16:51:45.672059 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.671882 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-oauth-serving-cert\") pod \"db06968c-5245-4ae6-9ba5-2c367b19044b\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " Mar 18 16:51:45.672059 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.671917 2576 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-oauth-config\") pod \"db06968c-5245-4ae6-9ba5-2c367b19044b\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " Mar 18 16:51:45.672059 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.671982 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-service-ca\") pod \"db06968c-5245-4ae6-9ba5-2c367b19044b\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " Mar 18 16:51:45.672059 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.672009 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49dm\" (UniqueName: \"kubernetes.io/projected/db06968c-5245-4ae6-9ba5-2c367b19044b-kube-api-access-b49dm\") pod \"db06968c-5245-4ae6-9ba5-2c367b19044b\" (UID: \"db06968c-5245-4ae6-9ba5-2c367b19044b\") " Mar 18 16:51:45.672397 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.672368 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-console-config" (OuterVolumeSpecName: "console-config") pod "db06968c-5245-4ae6-9ba5-2c367b19044b" (UID: "db06968c-5245-4ae6-9ba5-2c367b19044b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:51:45.672517 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.672470 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "db06968c-5245-4ae6-9ba5-2c367b19044b" (UID: "db06968c-5245-4ae6-9ba5-2c367b19044b"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:51:45.672577 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.672507 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-service-ca" (OuterVolumeSpecName: "service-ca") pod "db06968c-5245-4ae6-9ba5-2c367b19044b" (UID: "db06968c-5245-4ae6-9ba5-2c367b19044b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:51:45.672674 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.672638 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "db06968c-5245-4ae6-9ba5-2c367b19044b" (UID: "db06968c-5245-4ae6-9ba5-2c367b19044b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:51:45.674111 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.674071 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "db06968c-5245-4ae6-9ba5-2c367b19044b" (UID: "db06968c-5245-4ae6-9ba5-2c367b19044b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:51:45.674235 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.674157 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "db06968c-5245-4ae6-9ba5-2c367b19044b" (UID: "db06968c-5245-4ae6-9ba5-2c367b19044b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:51:45.674452 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.674435 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db06968c-5245-4ae6-9ba5-2c367b19044b-kube-api-access-b49dm" (OuterVolumeSpecName: "kube-api-access-b49dm") pod "db06968c-5245-4ae6-9ba5-2c367b19044b" (UID: "db06968c-5245-4ae6-9ba5-2c367b19044b"). InnerVolumeSpecName "kube-api-access-b49dm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:51:45.772799 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.772762 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-service-ca\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:51:45.772799 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.772796 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b49dm\" (UniqueName: \"kubernetes.io/projected/db06968c-5245-4ae6-9ba5-2c367b19044b-kube-api-access-b49dm\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:51:45.772799 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.772809 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:51:45.773064 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.772822 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-trusted-ca-bundle\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:51:45.773064 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.772834 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-console-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:51:45.773064 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.772846 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db06968c-5245-4ae6-9ba5-2c367b19044b-oauth-serving-cert\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:51:45.773064 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:45.772860 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db06968c-5245-4ae6-9ba5-2c367b19044b-console-oauth-config\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:51:46.270540 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:46.270512 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-857857b765-vm745_db06968c-5245-4ae6-9ba5-2c367b19044b/console/0.log" Mar 18 16:51:46.270720 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:46.270554 2576 generic.go:358] "Generic (PLEG): container finished" podID="db06968c-5245-4ae6-9ba5-2c367b19044b" containerID="b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e" exitCode=2 Mar 18 16:51:46.270720 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:46.270625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857857b765-vm745" event={"ID":"db06968c-5245-4ae6-9ba5-2c367b19044b","Type":"ContainerDied","Data":"b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e"} Mar 18 16:51:46.270720 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:46.270635 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-857857b765-vm745" Mar 18 16:51:46.270720 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:46.270659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857857b765-vm745" event={"ID":"db06968c-5245-4ae6-9ba5-2c367b19044b","Type":"ContainerDied","Data":"da7ca0443fcfde4892ed852860204b28776153b8b5759b1493c192665c3eb4d5"} Mar 18 16:51:46.270720 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:46.270678 2576 scope.go:117] "RemoveContainer" containerID="b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e" Mar 18 16:51:46.279066 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:46.279045 2576 scope.go:117] "RemoveContainer" containerID="b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e" Mar 18 16:51:46.279315 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:51:46.279296 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e\": container with ID starting with b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e not found: ID does not exist" containerID="b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e" Mar 18 16:51:46.279374 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:46.279324 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e"} err="failed to get container status \"b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e\": rpc error: code = NotFound desc = could not find container \"b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e\": container with ID starting with b3a7d15bd3f780901b90169165530864e55f6802b51f41b76ac2a113c020959e not found: ID does not exist" Mar 18 16:51:46.290128 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:46.290103 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-857857b765-vm745"] Mar 18 16:51:46.295545 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:46.295520 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-857857b765-vm745"] Mar 18 16:51:47.563702 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:47.563667 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db06968c-5245-4ae6-9ba5-2c367b19044b" path="/var/lib/kubelet/pods/db06968c-5245-4ae6-9ba5-2c367b19044b/volumes" Mar 18 16:51:48.211894 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:48.211845 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:51:48.212306 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:48.212273 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:51:58.211794 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:58.211746 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:51:58.212309 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:51:58.212284 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:51:59.964367 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:51:59.964312 2576 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:51:59.964884 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:51:59.964530 2576 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6p25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-jrcb2_kserve(c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:51:59.965717 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:51:59.965686 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source 
docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:52:08.211827 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:52:08.211777 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:52:08.212348 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:52:08.212301 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:52:13.560532 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:52:13.560499 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" 
pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:52:18.211913 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:52:18.211857 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:52:18.212402 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:52:18.212374 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:52:27.559863 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:52:27.559731 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:52:28.211664 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:52:28.211618 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:52:28.212041 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:52:28.212011 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:52:38.211930 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:52:38.211873 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:52:38.212505 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:52:38.212362 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:52:38.560060 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:52:38.560028 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" 
podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:52:48.212989 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:52:48.212929 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:52:48.213494 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:52:48.213092 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:52:51.560161 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:52:51.560130 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:53:01.421343 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.421302 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc"] Mar 18 16:53:01.421756 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.421733 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" containerID="cri-o://2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949" 
gracePeriod=30 Mar 18 16:53:01.421922 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.421850 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" containerID="cri-o://4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1" gracePeriod=30 Mar 18 16:53:01.575802 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.575770 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g"] Mar 18 16:53:01.576128 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.576114 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db06968c-5245-4ae6-9ba5-2c367b19044b" containerName="console" Mar 18 16:53:01.576208 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.576131 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="db06968c-5245-4ae6-9ba5-2c367b19044b" containerName="console" Mar 18 16:53:01.576208 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.576190 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="db06968c-5245-4ae6-9ba5-2c367b19044b" containerName="console" Mar 18 16:53:01.579546 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.579520 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:53:01.586680 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.586455 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g"] Mar 18 16:53:01.691673 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.691589 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr"] Mar 18 16:53:01.694669 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.694643 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:53:01.705275 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.705233 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr"] Mar 18 16:53:01.753723 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.753687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr5pr\" (UniqueName: \"kubernetes.io/projected/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kube-api-access-vr5pr\") pod \"isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g\" (UID: \"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:53:01.753893 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.753752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g\" (UID: \"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:53:01.854457 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.854399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g\" (UID: \"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:53:01.854457 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.854463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr\" (UID: \"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:53:01.854702 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.854506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr5pr\" (UniqueName: \"kubernetes.io/projected/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kube-api-access-vr5pr\") pod \"isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g\" (UID: \"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:53:01.854702 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.854525 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2sj9\" (UniqueName: \"kubernetes.io/projected/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kube-api-access-h2sj9\") pod \"isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr\" (UID: \"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:53:01.854820 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.854799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g\" (UID: \"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:53:01.862735 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.862697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr5pr\" (UniqueName: \"kubernetes.io/projected/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kube-api-access-vr5pr\") pod \"isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g\" (UID: \"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:53:01.894701 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.894661 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:53:01.955557 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.955511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr\" (UID: \"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:53:01.955697 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.955610 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2sj9\" (UniqueName: \"kubernetes.io/projected/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kube-api-access-h2sj9\") pod \"isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr\" (UID: \"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:53:01.956003 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.955978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr\" (UID: \"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:53:01.964134 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:01.964102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2sj9\" (UniqueName: \"kubernetes.io/projected/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kube-api-access-h2sj9\") pod \"isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr\" (UID: \"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:53:02.006124 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:02.006090 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:53:02.021467 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:02.021442 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g"] Mar 18 16:53:02.024390 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:53:02.024360 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd7d2b5_2dbb_4d51_a7f7_952a2a857210.slice/crio-76084e2ecd71529b3c46f6cba586879962ce6a90d162263ff6ec7e4580182d10 WatchSource:0}: Error finding container 76084e2ecd71529b3c46f6cba586879962ce6a90d162263ff6ec7e4580182d10: Status 404 returned error can't find the container with id 76084e2ecd71529b3c46f6cba586879962ce6a90d162263ff6ec7e4580182d10 Mar 18 16:53:02.137838 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:02.137810 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr"] Mar 18 16:53:02.139934 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:53:02.139901 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b87b1b_2bf2_4c56_bf0b_c2793f6c4ad7.slice/crio-8b77858c2c517a83c24030c3af64af23f42d9e7056de21b9e20609163bbd7868 WatchSource:0}: Error finding container 8b77858c2c517a83c24030c3af64af23f42d9e7056de21b9e20609163bbd7868: Status 404 returned error can't find the container with id 8b77858c2c517a83c24030c3af64af23f42d9e7056de21b9e20609163bbd7868 Mar 18 16:53:02.507821 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:02.507782 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" event={"ID":"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7","Type":"ContainerStarted","Data":"10f217d7c2105cbd43c49cc84ddc629bd4c05b76a69196991ba56f817676574e"} Mar 18 16:53:02.507821 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:02.507827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" event={"ID":"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7","Type":"ContainerStarted","Data":"8b77858c2c517a83c24030c3af64af23f42d9e7056de21b9e20609163bbd7868"} Mar 18 16:53:02.509297 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:02.509267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" event={"ID":"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210","Type":"ContainerStarted","Data":"de7a1ed0307500f849f58d39be018208d7a3b9bfaa203d354c376ce2553d087a"} Mar 18 16:53:02.509433 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:02.509304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" event={"ID":"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210","Type":"ContainerStarted","Data":"76084e2ecd71529b3c46f6cba586879962ce6a90d162263ff6ec7e4580182d10"} Mar 18 16:53:03.560464 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:53:03.560431 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:53:06.524398 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:06.524360 2576 generic.go:358] "Generic (PLEG): container finished" podID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerID="de7a1ed0307500f849f58d39be018208d7a3b9bfaa203d354c376ce2553d087a" exitCode=0 Mar 18 16:53:06.524818 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:06.524433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" event={"ID":"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210","Type":"ContainerDied","Data":"de7a1ed0307500f849f58d39be018208d7a3b9bfaa203d354c376ce2553d087a"} Mar 18 16:53:06.526088 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:06.526066 2576 generic.go:358] "Generic (PLEG): container finished" podID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerID="10f217d7c2105cbd43c49cc84ddc629bd4c05b76a69196991ba56f817676574e" exitCode=0 Mar 18 16:53:06.526188 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:06.526129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" event={"ID":"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7","Type":"ContainerDied","Data":"10f217d7c2105cbd43c49cc84ddc629bd4c05b76a69196991ba56f817676574e"} Mar 18 16:53:07.533072 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:07.533037 2576 generic.go:358] "Generic (PLEG): container finished" podID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerID="2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949" exitCode=0 Mar 18 16:53:07.533497 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:07.533116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" 
event={"ID":"b0def082-c96a-4df0-9ef8-12519113bcdb","Type":"ContainerDied","Data":"2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949"} Mar 18 16:53:07.534977 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:07.534950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" event={"ID":"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210","Type":"ContainerStarted","Data":"be57d860f0a35b2a6c686cf2965a79bba0f2c60300d0847150eb777723c5f3d7"} Mar 18 16:53:07.535263 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:07.535236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:53:07.536822 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:07.536790 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:53:07.553831 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:07.553649 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podStartSLOduration=6.553635317 podStartE2EDuration="6.553635317s" podCreationTimestamp="2026-03-18 16:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:53:07.553273935 +0000 UTC m=+578.603089363" watchObservedRunningTime="2026-03-18 16:53:07.553635317 +0000 UTC m=+578.603450726" Mar 18 16:53:08.211783 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:08.211734 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" 
podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:53:08.212230 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:08.212201 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:53:08.539911 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:08.539870 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:53:17.560568 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:53:17.560514 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:53:18.211839 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:18.211784 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" 
podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:53:18.212192 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:18.212162 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:53:18.540184 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:18.540137 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:53:27.604723 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:27.604635 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" event={"ID":"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7","Type":"ContainerStarted","Data":"0796818ba6dd9d6078d6b85c53c21a85c999c40aab2b668e6a708001a6235d6e"} Mar 18 16:53:27.605150 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:27.604912 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:53:27.606287 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:27.606263 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:53:27.623192 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:27.623131 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" podStartSLOduration=5.813839241 podStartE2EDuration="26.623118326s" podCreationTimestamp="2026-03-18 16:53:01 +0000 UTC" firstStartedPulling="2026-03-18 16:53:06.527406189 +0000 UTC m=+577.577221594" lastFinishedPulling="2026-03-18 16:53:27.336685274 +0000 UTC m=+598.386500679" observedRunningTime="2026-03-18 16:53:27.622881181 +0000 UTC m=+598.672696608" watchObservedRunningTime="2026-03-18 16:53:27.623118326 +0000 UTC m=+598.672933752" Mar 18 16:53:28.212243 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:28.212193 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Mar 18 16:53:28.212437 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:28.212330 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:53:28.212538 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:28.212510 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:53:28.212636 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:28.212624 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:53:28.540205 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:28.540113 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:53:28.607985 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:28.607927 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:53:29.459058 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:29.459030 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 16:53:29.461433 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:29.461412 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 16:53:31.574694 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.574672 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:53:31.608178 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.608149 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0def082-c96a-4df0-9ef8-12519113bcdb-kserve-provision-location\") pod \"b0def082-c96a-4df0-9ef8-12519113bcdb\" (UID: \"b0def082-c96a-4df0-9ef8-12519113bcdb\") " Mar 18 16:53:31.608355 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.608220 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8htds\" (UniqueName: \"kubernetes.io/projected/b0def082-c96a-4df0-9ef8-12519113bcdb-kube-api-access-8htds\") pod \"b0def082-c96a-4df0-9ef8-12519113bcdb\" (UID: \"b0def082-c96a-4df0-9ef8-12519113bcdb\") " Mar 18 16:53:31.608543 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.608515 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0def082-c96a-4df0-9ef8-12519113bcdb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0def082-c96a-4df0-9ef8-12519113bcdb" (UID: "b0def082-c96a-4df0-9ef8-12519113bcdb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:53:31.610374 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.610348 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0def082-c96a-4df0-9ef8-12519113bcdb-kube-api-access-8htds" (OuterVolumeSpecName: "kube-api-access-8htds") pod "b0def082-c96a-4df0-9ef8-12519113bcdb" (UID: "b0def082-c96a-4df0-9ef8-12519113bcdb"). InnerVolumeSpecName "kube-api-access-8htds". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:53:31.617636 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.617607 2576 generic.go:358] "Generic (PLEG): container finished" podID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerID="4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1" exitCode=0 Mar 18 16:53:31.617731 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.617685 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" Mar 18 16:53:31.617731 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.617692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" event={"ID":"b0def082-c96a-4df0-9ef8-12519113bcdb","Type":"ContainerDied","Data":"4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1"} Mar 18 16:53:31.617800 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.617730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc" event={"ID":"b0def082-c96a-4df0-9ef8-12519113bcdb","Type":"ContainerDied","Data":"d22bb5b4c794d1508cf2098bc850f3e1e916571ebc68f007b9295867dfd7b0ff"} Mar 18 16:53:31.617800 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.617752 2576 scope.go:117] "RemoveContainer" containerID="4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1" Mar 18 16:53:31.625927 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.625910 2576 scope.go:117] "RemoveContainer" containerID="2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949" Mar 18 16:53:31.633236 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.633215 2576 scope.go:117] "RemoveContainer" containerID="901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0" Mar 18 16:53:31.637988 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.637960 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc"] Mar 18 16:53:31.642069 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.642046 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-dbd25-predictor-5785fbfc74-vc5nc"] Mar 18 16:53:31.644905 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.644885 2576 scope.go:117] "RemoveContainer" containerID="4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1" Mar 18 16:53:31.645220 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:53:31.645198 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1\": container with ID starting with 4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1 not found: ID does not exist" containerID="4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1" Mar 18 16:53:31.645281 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.645230 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1"} err="failed to get container status \"4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1\": rpc error: code = NotFound desc = could not find container \"4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1\": container with ID starting with 4317713af0f86452249589667518e248d3fe152066ee9623a62f1af5425452d1 not found: ID does not exist" Mar 18 16:53:31.645281 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.645255 2576 scope.go:117] "RemoveContainer" containerID="2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949" Mar 18 16:53:31.645524 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:53:31.645506 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949\": container with ID starting with 2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949 not found: ID does not exist" containerID="2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949" Mar 18 16:53:31.645571 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.645533 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949"} err="failed to get container status \"2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949\": rpc error: code = NotFound desc = could not find container \"2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949\": container with ID starting with 2d3947a44fd8ab2439c88fec45e3c2cda191ff4d8169ded6d2738e2801f58949 not found: ID does not exist" Mar 18 16:53:31.645571 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.645550 2576 scope.go:117] "RemoveContainer" containerID="901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0" Mar 18 16:53:31.645775 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:53:31.645760 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0\": container with ID starting with 901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0 not found: ID does not exist" containerID="901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0" Mar 18 16:53:31.645822 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.645781 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0"} err="failed to get container status \"901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0\": rpc error: code = 
NotFound desc = could not find container \"901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0\": container with ID starting with 901606270e4a5cc9732044eec204c602891975b372c848404f33bd0fdc76fcc0 not found: ID does not exist" Mar 18 16:53:31.709404 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.709324 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8htds\" (UniqueName: \"kubernetes.io/projected/b0def082-c96a-4df0-9ef8-12519113bcdb-kube-api-access-8htds\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:53:31.709404 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:31.709354 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0def082-c96a-4df0-9ef8-12519113bcdb-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:53:32.564636 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:32.564608 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:53:32.836106 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:53:32.836004 2576 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:53:32.836547 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:53:32.836199 2576 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6p25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-jrcb2_kserve(c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:53:32.837837 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:53:32.837809 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the 
requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:53:33.563209 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:33.563176 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" path="/var/lib/kubelet/pods/b0def082-c96a-4df0-9ef8-12519113bcdb/volumes" Mar 18 16:53:38.540839 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:38.540797 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:53:38.608207 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:38.608161 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:53:47.559570 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:53:47.559535 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" 
podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:53:48.540096 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:48.540053 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:53:48.608842 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:48.608747 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:53:58.540231 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:58.540185 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:53:58.608539 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:53:58.608492 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:54:02.560596 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:54:02.560562 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source 
docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:54:08.540202 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:54:08.540155 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:54:08.608345 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:54:08.608301 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:54:15.559659 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:54:15.559621 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:54:15.560427 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:54:15.560280 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source 
docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:54:18.608732 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:54:18.608682 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Mar 18 16:54:25.560396 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:54:25.560345 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:54:27.560051 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:54:27.560019 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" 
pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:54:28.609012 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:54:28.608970 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:54:35.562779 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:54:35.562752 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:54:42.559915 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:54:42.559881 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:54:54.559973 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:54:54.559925 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: 
get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:55:01.904208 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.904171 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g"] Mar 18 16:55:01.904719 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.904515 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" containerID="cri-o://be57d860f0a35b2a6c686cf2965a79bba0f2c60300d0847150eb777723c5f3d7" gracePeriod=30 Mar 18 16:55:01.975767 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.975728 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg"] Mar 18 16:55:01.976131 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.976115 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" Mar 18 16:55:01.976222 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.976134 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" Mar 18 16:55:01.976222 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.976160 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" Mar 18 16:55:01.976222 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.976170 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" 
Mar 18 16:55:01.976222 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.976201 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="storage-initializer" Mar 18 16:55:01.976222 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.976210 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="storage-initializer" Mar 18 16:55:01.976472 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.976292 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="agent" Mar 18 16:55:01.976472 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.976309 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0def082-c96a-4df0-9ef8-12519113bcdb" containerName="kserve-container" Mar 18 16:55:01.979627 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.979606 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:55:01.986291 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:01.986259 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg"] Mar 18 16:55:02.126269 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.126228 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ec47a14-36c5-4c89-a104-5f58bd16048f-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg\" (UID: \"7ec47a14-36c5-4c89-a104-5f58bd16048f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:55:02.126446 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.126288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4mjr\" (UniqueName: \"kubernetes.io/projected/7ec47a14-36c5-4c89-a104-5f58bd16048f-kube-api-access-k4mjr\") pod \"isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg\" (UID: \"7ec47a14-36c5-4c89-a104-5f58bd16048f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:55:02.178468 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.178370 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8"] Mar 18 16:55:02.181777 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.181751 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:55:02.192181 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.192153 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8"] Mar 18 16:55:02.207132 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.207092 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr"] Mar 18 16:55:02.207507 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.207447 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" containerID="cri-o://0796818ba6dd9d6078d6b85c53c21a85c999c40aab2b668e6a708001a6235d6e" gracePeriod=30 Mar 18 16:55:02.227446 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.227410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ec47a14-36c5-4c89-a104-5f58bd16048f-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg\" (UID: \"7ec47a14-36c5-4c89-a104-5f58bd16048f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:55:02.227603 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.227463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4mjr\" (UniqueName: \"kubernetes.io/projected/7ec47a14-36c5-4c89-a104-5f58bd16048f-kube-api-access-k4mjr\") pod \"isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg\" (UID: \"7ec47a14-36c5-4c89-a104-5f58bd16048f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:55:02.227851 
ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.227827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ec47a14-36c5-4c89-a104-5f58bd16048f-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg\" (UID: \"7ec47a14-36c5-4c89-a104-5f58bd16048f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:55:02.234764 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.234691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4mjr\" (UniqueName: \"kubernetes.io/projected/7ec47a14-36c5-4c89-a104-5f58bd16048f-kube-api-access-k4mjr\") pod \"isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg\" (UID: \"7ec47a14-36c5-4c89-a104-5f58bd16048f\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:55:02.293682 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.293648 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:55:02.328757 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.328720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xhh\" (UniqueName: \"kubernetes.io/projected/8d320bb9-9907-4419-991a-81777e4b1906-kube-api-access-w5xhh\") pod \"isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8\" (UID: \"8d320bb9-9907-4419-991a-81777e4b1906\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:55:02.328757 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.328759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d320bb9-9907-4419-991a-81777e4b1906-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8\" (UID: \"8d320bb9-9907-4419-991a-81777e4b1906\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:55:02.418160 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.418134 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg"] Mar 18 16:55:02.420289 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:55:02.420260 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec47a14_36c5_4c89_a104_5f58bd16048f.slice/crio-a030637a527a5b4851a654d85b0a9ee0451fa8bc093b3b26805539e815bba802 WatchSource:0}: Error finding container a030637a527a5b4851a654d85b0a9ee0451fa8bc093b3b26805539e815bba802: Status 404 returned error can't find the container with id a030637a527a5b4851a654d85b0a9ee0451fa8bc093b3b26805539e815bba802 Mar 18 16:55:02.431534 ip-10-0-133-190 kubenswrapper[2576]: I0318 
16:55:02.431477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d320bb9-9907-4419-991a-81777e4b1906-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8\" (UID: \"8d320bb9-9907-4419-991a-81777e4b1906\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:55:02.431629 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.431574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xhh\" (UniqueName: \"kubernetes.io/projected/8d320bb9-9907-4419-991a-81777e4b1906-kube-api-access-w5xhh\") pod \"isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8\" (UID: \"8d320bb9-9907-4419-991a-81777e4b1906\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:55:02.431856 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.431837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d320bb9-9907-4419-991a-81777e4b1906-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8\" (UID: \"8d320bb9-9907-4419-991a-81777e4b1906\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:55:02.438826 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.438802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xhh\" (UniqueName: \"kubernetes.io/projected/8d320bb9-9907-4419-991a-81777e4b1906-kube-api-access-w5xhh\") pod \"isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8\" (UID: \"8d320bb9-9907-4419-991a-81777e4b1906\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:55:02.496678 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.496648 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:55:02.631127 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.631103 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8"] Mar 18 16:55:02.633461 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:55:02.633434 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d320bb9_9907_4419_991a_81777e4b1906.slice/crio-cd16d4126e49079ce358eed61529d4cd182973fbf42a687ee988212f9c7d6430 WatchSource:0}: Error finding container cd16d4126e49079ce358eed61529d4cd182973fbf42a687ee988212f9c7d6430: Status 404 returned error can't find the container with id cd16d4126e49079ce358eed61529d4cd182973fbf42a687ee988212f9c7d6430 Mar 18 16:55:02.907376 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.907332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" event={"ID":"7ec47a14-36c5-4c89-a104-5f58bd16048f","Type":"ContainerStarted","Data":"a2aa83a52cc790ab09190047403da5bb5731569568d605d6a4cf006249fc6dbf"} Mar 18 16:55:02.907816 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.907382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" event={"ID":"7ec47a14-36c5-4c89-a104-5f58bd16048f","Type":"ContainerStarted","Data":"a030637a527a5b4851a654d85b0a9ee0451fa8bc093b3b26805539e815bba802"} Mar 18 16:55:02.908730 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.908699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" 
event={"ID":"8d320bb9-9907-4419-991a-81777e4b1906","Type":"ContainerStarted","Data":"6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5"} Mar 18 16:55:02.908863 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:02.908737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" event={"ID":"8d320bb9-9907-4419-991a-81777e4b1906","Type":"ContainerStarted","Data":"cd16d4126e49079ce358eed61529d4cd182973fbf42a687ee988212f9c7d6430"} Mar 18 16:55:05.560106 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:05.559601 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Mar 18 16:55:05.560322 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:55:05.560121 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:55:05.921038 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:05.921000 2576 generic.go:358] "Generic (PLEG): container finished" podID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" 
containerID="0796818ba6dd9d6078d6b85c53c21a85c999c40aab2b668e6a708001a6235d6e" exitCode=0 Mar 18 16:55:05.921193 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:05.921074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" event={"ID":"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7","Type":"ContainerDied","Data":"0796818ba6dd9d6078d6b85c53c21a85c999c40aab2b668e6a708001a6235d6e"} Mar 18 16:55:05.949984 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:05.949962 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:55:06.063292 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.063247 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2sj9\" (UniqueName: \"kubernetes.io/projected/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kube-api-access-h2sj9\") pod \"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7\" (UID: \"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7\") " Mar 18 16:55:06.063487 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.063320 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kserve-provision-location\") pod \"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7\" (UID: \"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7\") " Mar 18 16:55:06.063662 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.063632 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" (UID: "71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:55:06.065445 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.065412 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kube-api-access-h2sj9" (OuterVolumeSpecName: "kube-api-access-h2sj9") pod "71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" (UID: "71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7"). InnerVolumeSpecName "kube-api-access-h2sj9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:55:06.164622 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.164588 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:55:06.164622 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.164615 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h2sj9\" (UniqueName: \"kubernetes.io/projected/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7-kube-api-access-h2sj9\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:55:06.463415 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:55:06.463381 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec47a14_36c5_4c89_a104_5f58bd16048f.slice/crio-a2aa83a52cc790ab09190047403da5bb5731569568d605d6a4cf006249fc6dbf.scope\": RecentStats: unable to find data in memory cache]" Mar 18 16:55:06.926062 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.926020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" event={"ID":"71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7","Type":"ContainerDied","Data":"8b77858c2c517a83c24030c3af64af23f42d9e7056de21b9e20609163bbd7868"} 
Mar 18 16:55:06.926484 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.926068 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr" Mar 18 16:55:06.926484 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.926080 2576 scope.go:117] "RemoveContainer" containerID="0796818ba6dd9d6078d6b85c53c21a85c999c40aab2b668e6a708001a6235d6e" Mar 18 16:55:06.927642 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.927612 2576 generic.go:358] "Generic (PLEG): container finished" podID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerID="a2aa83a52cc790ab09190047403da5bb5731569568d605d6a4cf006249fc6dbf" exitCode=0 Mar 18 16:55:06.927750 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.927700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" event={"ID":"7ec47a14-36c5-4c89-a104-5f58bd16048f","Type":"ContainerDied","Data":"a2aa83a52cc790ab09190047403da5bb5731569568d605d6a4cf006249fc6dbf"} Mar 18 16:55:06.929286 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.929258 2576 generic.go:358] "Generic (PLEG): container finished" podID="8d320bb9-9907-4419-991a-81777e4b1906" containerID="6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5" exitCode=0 Mar 18 16:55:06.929398 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.929384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" event={"ID":"8d320bb9-9907-4419-991a-81777e4b1906","Type":"ContainerDied","Data":"6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5"} Mar 18 16:55:06.931833 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.931810 2576 generic.go:358] "Generic (PLEG): container finished" podID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerID="be57d860f0a35b2a6c686cf2965a79bba0f2c60300d0847150eb777723c5f3d7" 
exitCode=0 Mar 18 16:55:06.931931 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.931844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" event={"ID":"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210","Type":"ContainerDied","Data":"be57d860f0a35b2a6c686cf2965a79bba0f2c60300d0847150eb777723c5f3d7"} Mar 18 16:55:06.943350 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.943262 2576 scope.go:117] "RemoveContainer" containerID="10f217d7c2105cbd43c49cc84ddc629bd4c05b76a69196991ba56f817676574e" Mar 18 16:55:06.980414 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.980358 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr"] Mar 18 16:55:06.981876 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:06.981854 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-41715-predictor-78fc4fb6ff-9f2fr"] Mar 18 16:55:07.057417 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.057394 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:55:07.173459 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.173366 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr5pr\" (UniqueName: \"kubernetes.io/projected/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kube-api-access-vr5pr\") pod \"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210\" (UID: \"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210\") " Mar 18 16:55:07.173459 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.173425 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kserve-provision-location\") pod \"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210\" (UID: \"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210\") " Mar 18 16:55:07.173819 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.173788 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" (UID: "ecd7d2b5-2dbb-4d51-a7f7-952a2a857210"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:55:07.175409 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.175386 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kube-api-access-vr5pr" (OuterVolumeSpecName: "kube-api-access-vr5pr") pod "ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" (UID: "ecd7d2b5-2dbb-4d51-a7f7-952a2a857210"). InnerVolumeSpecName "kube-api-access-vr5pr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:55:07.274465 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.274435 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vr5pr\" (UniqueName: \"kubernetes.io/projected/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kube-api-access-vr5pr\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:55:07.274465 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.274462 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:55:07.562736 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.562693 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" path="/var/lib/kubelet/pods/71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7/volumes" Mar 18 16:55:07.937265 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.937232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" event={"ID":"7ec47a14-36c5-4c89-a104-5f58bd16048f","Type":"ContainerStarted","Data":"ba9caa4bb4178b887cb83fbcd1ee644faf96f07188e415f856f65c0c4bb22b59"} Mar 18 16:55:07.937655 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.937551 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:55:07.938927 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.938901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" event={"ID":"8d320bb9-9907-4419-991a-81777e4b1906","Type":"ContainerStarted","Data":"f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16"} Mar 18 
16:55:07.939211 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.939192 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:55:07.939346 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.939319 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:55:07.940322 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.940302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" event={"ID":"ecd7d2b5-2dbb-4d51-a7f7-952a2a857210","Type":"ContainerDied","Data":"76084e2ecd71529b3c46f6cba586879962ce6a90d162263ff6ec7e4580182d10"} Mar 18 16:55:07.940434 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.940336 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g" Mar 18 16:55:07.940434 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.940379 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Mar 18 16:55:07.940434 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.940337 2576 scope.go:117] "RemoveContainer" containerID="be57d860f0a35b2a6c686cf2965a79bba0f2c60300d0847150eb777723c5f3d7" Mar 18 16:55:07.948572 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.948554 2576 scope.go:117] "RemoveContainer" containerID="de7a1ed0307500f849f58d39be018208d7a3b9bfaa203d354c376ce2553d087a" Mar 18 16:55:07.954603 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.954557 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podStartSLOduration=6.954541864 podStartE2EDuration="6.954541864s" podCreationTimestamp="2026-03-18 16:55:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:55:07.953458256 +0000 UTC m=+699.003273683" watchObservedRunningTime="2026-03-18 16:55:07.954541864 +0000 UTC m=+699.004357292" Mar 18 16:55:07.968365 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.968316 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" podStartSLOduration=5.968302838 podStartE2EDuration="5.968302838s" podCreationTimestamp="2026-03-18 16:55:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 16:55:07.966467511 +0000 UTC m=+699.016282935" watchObservedRunningTime="2026-03-18 16:55:07.968302838 +0000 UTC m=+699.018118264" Mar 18 16:55:07.977029 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.976986 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g"] Mar 18 16:55:07.978538 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:07.978515 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-41715-predictor-b5c8f9648-jxx9g"] Mar 18 16:55:08.946208 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:08.946164 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Mar 18 16:55:08.946606 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:08.946295 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:55:09.562983 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:09.562928 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" path="/var/lib/kubelet/pods/ecd7d2b5-2dbb-4d51-a7f7-952a2a857210/volumes" Mar 18 16:55:16.560246 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:55:16.560203 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image 
err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:55:18.946816 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:18.946727 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:55:18.947208 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:18.946727 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Mar 18 16:55:28.946659 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:28.946611 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Mar 18 16:55:28.947064 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:28.946611 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:55:29.561282 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:55:29.561246 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:55:38.946379 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:38.946331 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Mar 18 16:55:38.946857 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:38.946331 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:55:42.559743 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:55:42.559711 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull 
image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:55:48.946580 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:48.946533 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Mar 18 16:55:48.947114 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:48.946533 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:55:56.560111 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:55:56.560073 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: 
unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:55:58.946866 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:58.946825 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Mar 18 16:55:58.947293 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:55:58.946825 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:56:08.559992 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:56:08.559959 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:56:08.947084 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:56:08.946984 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:56:08.947972 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:56:08.947909 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:56:18.947249 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:56:18.947195 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:56:20.847554 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:56:20.847403 2576 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast" Mar 18 16:56:20.848005 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:56:20.847585 2576 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6p25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-jrcb2_kserve(c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError" Mar 18 16:56:20.848764 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:56:20.848735 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the 
requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:56:28.946405 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:56:28.946356 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:56:30.559557 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:56:30.559508 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Mar 18 16:56:34.559618 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:56:34.559482 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:56:40.560430 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:56:40.560394 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" 
Mar 18 16:56:46.559826 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:56:46.559716 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:56:59.562010 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:56:59.561851 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:57:02.194746 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.194711 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg"] Mar 18 16:57:02.195245 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.195175 
2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" containerID="cri-o://ba9caa4bb4178b887cb83fbcd1ee644faf96f07188e415f856f65c0c4bb22b59" gracePeriod=30 Mar 18 16:57:02.210668 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.210634 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w"] Mar 18 16:57:02.211034 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.211006 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" Mar 18 16:57:02.211034 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.211027 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" Mar 18 16:57:02.211396 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.211063 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="storage-initializer" Mar 18 16:57:02.211396 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.211073 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="storage-initializer" Mar 18 16:57:02.211396 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.211309 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="storage-initializer" Mar 18 16:57:02.211396 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.211326 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="storage-initializer" Mar 18 16:57:02.211396 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.211341 
2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" Mar 18 16:57:02.211396 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.211350 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" Mar 18 16:57:02.211712 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.211434 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecd7d2b5-2dbb-4d51-a7f7-952a2a857210" containerName="kserve-container" Mar 18 16:57:02.211712 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.211449 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="71b87b1b-2bf2-4c56-bf0b-c2793f6c4ad7" containerName="kserve-container" Mar 18 16:57:02.214473 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.214452 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w"] Mar 18 16:57:02.214580 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.214564 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" Mar 18 16:57:02.287267 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.287224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5lsz\" (UniqueName: \"kubernetes.io/projected/b6715332-90cd-43ab-b914-58744152dfea-kube-api-access-n5lsz\") pod \"message-dumper-raw-31778-predictor-86547b75b6-96h9w\" (UID: \"b6715332-90cd-43ab-b914-58744152dfea\") " pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" Mar 18 16:57:02.298815 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.298762 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8"] Mar 18 16:57:02.299227 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.299174 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" containerID="cri-o://f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16" gracePeriod=30 Mar 18 16:57:02.388079 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.388039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5lsz\" (UniqueName: \"kubernetes.io/projected/b6715332-90cd-43ab-b914-58744152dfea-kube-api-access-n5lsz\") pod \"message-dumper-raw-31778-predictor-86547b75b6-96h9w\" (UID: \"b6715332-90cd-43ab-b914-58744152dfea\") " pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" Mar 18 16:57:02.398091 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.398065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5lsz\" (UniqueName: 
\"kubernetes.io/projected/b6715332-90cd-43ab-b914-58744152dfea-kube-api-access-n5lsz\") pod \"message-dumper-raw-31778-predictor-86547b75b6-96h9w\" (UID: \"b6715332-90cd-43ab-b914-58744152dfea\") " pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" Mar 18 16:57:02.526327 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.526292 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" Mar 18 16:57:02.648204 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:02.648180 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w"] Mar 18 16:57:02.650773 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:57:02.650738 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6715332_90cd_43ab_b914_58744152dfea.slice/crio-73551c6770d9d9d134bd312cc69d9e99fa54a15813672f98c0099d1b6c56e196 WatchSource:0}: Error finding container 73551c6770d9d9d134bd312cc69d9e99fa54a15813672f98c0099d1b6c56e196: Status 404 returned error can't find the container with id 73551c6770d9d9d134bd312cc69d9e99fa54a15813672f98c0099d1b6c56e196 Mar 18 16:57:03.306822 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:03.306791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" event={"ID":"b6715332-90cd-43ab-b914-58744152dfea","Type":"ContainerStarted","Data":"73551c6770d9d9d134bd312cc69d9e99fa54a15813672f98c0099d1b6c56e196"} Mar 18 16:57:04.310679 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:04.310643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" 
event={"ID":"b6715332-90cd-43ab-b914-58744152dfea","Type":"ContainerStarted","Data":"0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f"} Mar 18 16:57:04.311096 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:04.310798 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" Mar 18 16:57:04.312554 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:04.312530 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" Mar 18 16:57:04.326317 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:04.326276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" podStartSLOduration=1.287127338 podStartE2EDuration="2.326263482s" podCreationTimestamp="2026-03-18 16:57:02 +0000 UTC" firstStartedPulling="2026-03-18 16:57:02.652687949 +0000 UTC m=+813.702503358" lastFinishedPulling="2026-03-18 16:57:03.691824086 +0000 UTC m=+814.741639502" observedRunningTime="2026-03-18 16:57:04.324461504 +0000 UTC m=+815.374276931" watchObservedRunningTime="2026-03-18 16:57:04.326263482 +0000 UTC m=+815.376078909" Mar 18 16:57:06.242859 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.242835 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:57:06.318235 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.318143 2576 generic.go:358] "Generic (PLEG): container finished" podID="8d320bb9-9907-4419-991a-81777e4b1906" containerID="f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16" exitCode=0 Mar 18 16:57:06.318235 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.318215 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" Mar 18 16:57:06.318414 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.318226 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" event={"ID":"8d320bb9-9907-4419-991a-81777e4b1906","Type":"ContainerDied","Data":"f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16"} Mar 18 16:57:06.318414 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.318272 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8" event={"ID":"8d320bb9-9907-4419-991a-81777e4b1906","Type":"ContainerDied","Data":"cd16d4126e49079ce358eed61529d4cd182973fbf42a687ee988212f9c7d6430"} Mar 18 16:57:06.318414 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.318290 2576 scope.go:117] "RemoveContainer" containerID="f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16" Mar 18 16:57:06.320603 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.320582 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d320bb9-9907-4419-991a-81777e4b1906-kserve-provision-location\") pod \"8d320bb9-9907-4419-991a-81777e4b1906\" (UID: \"8d320bb9-9907-4419-991a-81777e4b1906\") " Mar 18 16:57:06.320729 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.320616 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5xhh\" (UniqueName: \"kubernetes.io/projected/8d320bb9-9907-4419-991a-81777e4b1906-kube-api-access-w5xhh\") pod \"8d320bb9-9907-4419-991a-81777e4b1906\" (UID: \"8d320bb9-9907-4419-991a-81777e4b1906\") " Mar 18 16:57:06.320977 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.320932 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8d320bb9-9907-4419-991a-81777e4b1906-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8d320bb9-9907-4419-991a-81777e4b1906" (UID: "8d320bb9-9907-4419-991a-81777e4b1906"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:57:06.322691 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.322665 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d320bb9-9907-4419-991a-81777e4b1906-kube-api-access-w5xhh" (OuterVolumeSpecName: "kube-api-access-w5xhh") pod "8d320bb9-9907-4419-991a-81777e4b1906" (UID: "8d320bb9-9907-4419-991a-81777e4b1906"). InnerVolumeSpecName "kube-api-access-w5xhh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:57:06.326630 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.326608 2576 scope.go:117] "RemoveContainer" containerID="6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5" Mar 18 16:57:06.337437 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.337419 2576 scope.go:117] "RemoveContainer" containerID="f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16" Mar 18 16:57:06.337720 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:57:06.337694 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16\": container with ID starting with f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16 not found: ID does not exist" containerID="f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16" Mar 18 16:57:06.337765 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.337731 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16"} err="failed to get container status 
\"f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16\": rpc error: code = NotFound desc = could not find container \"f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16\": container with ID starting with f17c3e99bb7e32d82b58c8918dceeba9d47e9c65d7b8486a60afd5bcb9222f16 not found: ID does not exist" Mar 18 16:57:06.337765 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.337749 2576 scope.go:117] "RemoveContainer" containerID="6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5" Mar 18 16:57:06.338046 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:57:06.338018 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5\": container with ID starting with 6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5 not found: ID does not exist" containerID="6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5" Mar 18 16:57:06.338122 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.338047 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5"} err="failed to get container status \"6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5\": rpc error: code = NotFound desc = could not find container \"6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5\": container with ID starting with 6c531b40335e101be358e6d7689cb30919cc922c0fed89ce9b4d319575e737e5 not found: ID does not exist" Mar 18 16:57:06.421867 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.421828 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d320bb9-9907-4419-991a-81777e4b1906-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:57:06.421867 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:57:06.421856 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5xhh\" (UniqueName: \"kubernetes.io/projected/8d320bb9-9907-4419-991a-81777e4b1906-kube-api-access-w5xhh\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:57:06.639219 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.639184 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8"] Mar 18 16:57:06.642453 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:06.642427 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-598eb-predictor-77c45bdc67-7w5w8"] Mar 18 16:57:07.323644 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.323615 2576 generic.go:358] "Generic (PLEG): container finished" podID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerID="ba9caa4bb4178b887cb83fbcd1ee644faf96f07188e415f856f65c0c4bb22b59" exitCode=0 Mar 18 16:57:07.323975 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.323645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" event={"ID":"7ec47a14-36c5-4c89-a104-5f58bd16048f","Type":"ContainerDied","Data":"ba9caa4bb4178b887cb83fbcd1ee644faf96f07188e415f856f65c0c4bb22b59"} Mar 18 16:57:07.323975 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.323689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" event={"ID":"7ec47a14-36c5-4c89-a104-5f58bd16048f","Type":"ContainerDied","Data":"a030637a527a5b4851a654d85b0a9ee0451fa8bc093b3b26805539e815bba802"} Mar 18 16:57:07.323975 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.323704 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a030637a527a5b4851a654d85b0a9ee0451fa8bc093b3b26805539e815bba802" Mar 18 
16:57:07.325935 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.325915 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:57:07.430990 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.430874 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4mjr\" (UniqueName: \"kubernetes.io/projected/7ec47a14-36c5-4c89-a104-5f58bd16048f-kube-api-access-k4mjr\") pod \"7ec47a14-36c5-4c89-a104-5f58bd16048f\" (UID: \"7ec47a14-36c5-4c89-a104-5f58bd16048f\") " Mar 18 16:57:07.430990 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.430961 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ec47a14-36c5-4c89-a104-5f58bd16048f-kserve-provision-location\") pod \"7ec47a14-36c5-4c89-a104-5f58bd16048f\" (UID: \"7ec47a14-36c5-4c89-a104-5f58bd16048f\") " Mar 18 16:57:07.431301 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.431274 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ec47a14-36c5-4c89-a104-5f58bd16048f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7ec47a14-36c5-4c89-a104-5f58bd16048f" (UID: "7ec47a14-36c5-4c89-a104-5f58bd16048f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:57:07.433148 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.433114 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec47a14-36c5-4c89-a104-5f58bd16048f-kube-api-access-k4mjr" (OuterVolumeSpecName: "kube-api-access-k4mjr") pod "7ec47a14-36c5-4c89-a104-5f58bd16048f" (UID: "7ec47a14-36c5-4c89-a104-5f58bd16048f"). InnerVolumeSpecName "kube-api-access-k4mjr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:57:07.531590 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.531549 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k4mjr\" (UniqueName: \"kubernetes.io/projected/7ec47a14-36c5-4c89-a104-5f58bd16048f-kube-api-access-k4mjr\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:57:07.531590 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.531583 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ec47a14-36c5-4c89-a104-5f58bd16048f-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:57:07.562798 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:07.562756 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d320bb9-9907-4419-991a-81777e4b1906" path="/var/lib/kubelet/pods/8d320bb9-9907-4419-991a-81777e4b1906/volumes" Mar 18 16:57:08.326477 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:08.326445 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg" Mar 18 16:57:08.341404 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:08.341373 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg"] Mar 18 16:57:08.346554 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:08.346527 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-598eb-predictor-6f5f56fb5b-l8rkg"] Mar 18 16:57:09.563123 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:09.563093 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" path="/var/lib/kubelet/pods/7ec47a14-36c5-4c89-a104-5f58bd16048f/volumes" Mar 18 16:57:10.559839 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:57:10.559810 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:57:12.090997 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.090967 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6"] Mar 18 16:57:12.091440 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.091290 2576 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" Mar 18 16:57:12.091440 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.091305 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" Mar 18 16:57:12.091440 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.091315 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="storage-initializer" Mar 18 16:57:12.091440 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.091321 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="storage-initializer" Mar 18 16:57:12.091440 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.091326 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="storage-initializer" Mar 18 16:57:12.091440 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.091332 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="storage-initializer" Mar 18 16:57:12.091440 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.091343 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" Mar 18 16:57:12.091440 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.091350 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" Mar 18 16:57:12.091440 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.091400 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d320bb9-9907-4419-991a-81777e4b1906" containerName="kserve-container" Mar 18 16:57:12.091440 ip-10-0-133-190 kubenswrapper[2576]: I0318 
16:57:12.091410 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ec47a14-36c5-4c89-a104-5f58bd16048f" containerName="kserve-container" Mar 18 16:57:12.095797 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.095766 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:57:12.102275 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.102248 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6"] Mar 18 16:57:12.167198 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.167157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kserve-provision-location\") pod \"isvc-logger-raw-31778-predictor-d454ff497-sxzd6\" (UID: \"be43e454-6c78-4ba5-bddf-b3c5c0bb2626\") " pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:57:12.167198 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.167205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rzs6\" (UniqueName: \"kubernetes.io/projected/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kube-api-access-4rzs6\") pod \"isvc-logger-raw-31778-predictor-d454ff497-sxzd6\" (UID: \"be43e454-6c78-4ba5-bddf-b3c5c0bb2626\") " pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:57:12.268040 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.267985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kserve-provision-location\") pod \"isvc-logger-raw-31778-predictor-d454ff497-sxzd6\" (UID: \"be43e454-6c78-4ba5-bddf-b3c5c0bb2626\") " 
pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:57:12.268221 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.268063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rzs6\" (UniqueName: \"kubernetes.io/projected/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kube-api-access-4rzs6\") pod \"isvc-logger-raw-31778-predictor-d454ff497-sxzd6\" (UID: \"be43e454-6c78-4ba5-bddf-b3c5c0bb2626\") " pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:57:12.268467 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.268444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kserve-provision-location\") pod \"isvc-logger-raw-31778-predictor-d454ff497-sxzd6\" (UID: \"be43e454-6c78-4ba5-bddf-b3c5c0bb2626\") " pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:57:12.275539 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.275517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rzs6\" (UniqueName: \"kubernetes.io/projected/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kube-api-access-4rzs6\") pod \"isvc-logger-raw-31778-predictor-d454ff497-sxzd6\" (UID: \"be43e454-6c78-4ba5-bddf-b3c5c0bb2626\") " pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:57:12.408094 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.408007 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:57:12.529926 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:12.529902 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6"] Mar 18 16:57:12.532632 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:57:12.532604 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe43e454_6c78_4ba5_bddf_b3c5c0bb2626.slice/crio-9ec01c7eced47bcbcb5518255c85ee8033266d80d9bf7c2a092aff74c410d9e8 WatchSource:0}: Error finding container 9ec01c7eced47bcbcb5518255c85ee8033266d80d9bf7c2a092aff74c410d9e8: Status 404 returned error can't find the container with id 9ec01c7eced47bcbcb5518255c85ee8033266d80d9bf7c2a092aff74c410d9e8 Mar 18 16:57:13.343168 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:13.343132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" event={"ID":"be43e454-6c78-4ba5-bddf-b3c5c0bb2626","Type":"ContainerStarted","Data":"58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334"} Mar 18 16:57:13.343168 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:13.343172 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" event={"ID":"be43e454-6c78-4ba5-bddf-b3c5c0bb2626","Type":"ContainerStarted","Data":"9ec01c7eced47bcbcb5518255c85ee8033266d80d9bf7c2a092aff74c410d9e8"} Mar 18 16:57:17.356930 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:17.356895 2576 generic.go:358] "Generic (PLEG): container finished" podID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerID="58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334" exitCode=0 Mar 18 16:57:17.357298 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:17.356968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" event={"ID":"be43e454-6c78-4ba5-bddf-b3c5c0bb2626","Type":"ContainerDied","Data":"58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334"} Mar 18 16:57:18.362185 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:18.362149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" event={"ID":"be43e454-6c78-4ba5-bddf-b3c5c0bb2626","Type":"ContainerStarted","Data":"d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2"} Mar 18 16:57:18.362185 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:18.362186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" event={"ID":"be43e454-6c78-4ba5-bddf-b3c5c0bb2626","Type":"ContainerStarted","Data":"7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f"} Mar 18 16:57:18.362648 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:18.362483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:57:18.362648 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:18.362512 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:57:18.363913 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:18.363882 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:57:18.364533 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:18.364508 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:18.378567 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:18.378523 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podStartSLOduration=6.378509543 podStartE2EDuration="6.378509543s" podCreationTimestamp="2026-03-18 16:57:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:57:18.377195513 +0000 UTC m=+829.427010940" watchObservedRunningTime="2026-03-18 16:57:18.378509543 +0000 UTC m=+829.428324970" Mar 18 16:57:19.366121 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:19.366074 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:57:19.366525 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:19.366503 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:24.559856 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:57:24.559825 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading 
manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:57:29.366096 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:29.366056 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:57:29.366589 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:29.366522 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:35.560028 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:57:35.559994 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:57:39.366856 
ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:39.366806 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:57:39.367298 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:39.367253 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:46.560409 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:57:46.560372 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:57:49.366664 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:49.366620 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:57:49.367233 ip-10-0-133-190 kubenswrapper[2576]: I0318 
16:57:49.367205 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:59.366631 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:59.366578 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:57:59.367055 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:57:59.367026 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:58:00.559396 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:58:00.559349 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:58:09.366145 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:09.366089 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:58:09.366629 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:09.366568 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:58:15.560274 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:58:15.560245 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:58:19.366858 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:19.366742 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:58:19.367342 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:19.367257 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:58:28.560437 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:58:28.560395 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:58:29.366592 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:29.366538 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:58:29.366997 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:29.366970 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:58:29.483427 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:29.483390 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 16:58:29.487703 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:29.487683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 16:58:39.366224 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:39.366172 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:58:39.368373 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:39.366588 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:58:40.559319 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:40.559269 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:58:40.559745 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:40.559588 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:58:41.560365 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:41.560310 2576 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 18 16:58:41.560663 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:58:41.560573 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:58:50.560181 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:50.560141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:58:50.560596 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:50.560225 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" Mar 18 16:58:56.560399 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:58:56.560360 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:58:57.095958 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.095914 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-31778-predictor-86547b75b6-96h9w_b6715332-90cd-43ab-b914-58744152dfea/kserve-container/0.log" Mar 18 16:58:57.527356 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.527325 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6"] Mar 18 16:58:57.527647 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.527625 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" containerID="cri-o://7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f" gracePeriod=30 Mar 18 16:58:57.527784 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.527729 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" containerID="cri-o://d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2" gracePeriod=30 Mar 18 16:58:57.708252 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.708206 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w"] Mar 18 16:58:57.709026 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.708970 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" podUID="b6715332-90cd-43ab-b914-58744152dfea" 
containerName="kserve-container" containerID="cri-o://0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f" gracePeriod=30 Mar 18 16:58:57.777468 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.777382 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4"] Mar 18 16:58:57.780541 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.780519 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 16:58:57.787980 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.787934 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4"] Mar 18 16:58:57.802320 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.802284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4\" (UID: \"5b4856d3-fbfd-4a47-aff7-1c797aea2b16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 16:58:57.802474 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.802329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwpbm\" (UniqueName: \"kubernetes.io/projected/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kube-api-access-fwpbm\") pod \"isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4\" (UID: \"5b4856d3-fbfd-4a47-aff7-1c797aea2b16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 16:58:57.903268 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.903231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4\" (UID: \"5b4856d3-fbfd-4a47-aff7-1c797aea2b16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 16:58:57.903268 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.903271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwpbm\" (UniqueName: \"kubernetes.io/projected/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kube-api-access-fwpbm\") pod \"isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4\" (UID: \"5b4856d3-fbfd-4a47-aff7-1c797aea2b16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 16:58:57.903722 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.903698 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4\" (UID: \"5b4856d3-fbfd-4a47-aff7-1c797aea2b16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 16:58:57.911246 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.911219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwpbm\" (UniqueName: \"kubernetes.io/projected/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kube-api-access-fwpbm\") pod \"isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4\" (UID: \"5b4856d3-fbfd-4a47-aff7-1c797aea2b16\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 16:58:57.968275 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:57.968245 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" Mar 18 16:58:58.003760 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.003727 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5lsz\" (UniqueName: \"kubernetes.io/projected/b6715332-90cd-43ab-b914-58744152dfea-kube-api-access-n5lsz\") pod \"b6715332-90cd-43ab-b914-58744152dfea\" (UID: \"b6715332-90cd-43ab-b914-58744152dfea\") " Mar 18 16:58:58.005781 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.005748 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6715332-90cd-43ab-b914-58744152dfea-kube-api-access-n5lsz" (OuterVolumeSpecName: "kube-api-access-n5lsz") pod "b6715332-90cd-43ab-b914-58744152dfea" (UID: "b6715332-90cd-43ab-b914-58744152dfea"). InnerVolumeSpecName "kube-api-access-n5lsz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:58:58.093428 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.093329 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 16:58:58.104606 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.104574 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n5lsz\" (UniqueName: \"kubernetes.io/projected/b6715332-90cd-43ab-b914-58744152dfea-kube-api-access-n5lsz\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 16:58:58.219965 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.219915 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4"] Mar 18 16:58:58.222980 ip-10-0-133-190 kubenswrapper[2576]: W0318 16:58:58.222952 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b4856d3_fbfd_4a47_aff7_1c797aea2b16.slice/crio-0abbc9390f9fc6333b60a609d3fddd9188d49a7a7d287158f50445897694f7c5 WatchSource:0}: Error finding container 0abbc9390f9fc6333b60a609d3fddd9188d49a7a7d287158f50445897694f7c5: Status 404 returned error can't find the container with id 0abbc9390f9fc6333b60a609d3fddd9188d49a7a7d287158f50445897694f7c5 Mar 18 16:58:58.681265 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.681232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" event={"ID":"5b4856d3-fbfd-4a47-aff7-1c797aea2b16","Type":"ContainerStarted","Data":"2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0"} Mar 18 16:58:58.681265 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.681272 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" event={"ID":"5b4856d3-fbfd-4a47-aff7-1c797aea2b16","Type":"ContainerStarted","Data":"0abbc9390f9fc6333b60a609d3fddd9188d49a7a7d287158f50445897694f7c5"} Mar 18 16:58:58.682366 ip-10-0-133-190 
kubenswrapper[2576]: I0318 16:58:58.682342 2576 generic.go:358] "Generic (PLEG): container finished" podID="b6715332-90cd-43ab-b914-58744152dfea" containerID="0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f" exitCode=2 Mar 18 16:58:58.682464 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.682385 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" event={"ID":"b6715332-90cd-43ab-b914-58744152dfea","Type":"ContainerDied","Data":"0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f"} Mar 18 16:58:58.682464 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.682394 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" Mar 18 16:58:58.682464 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.682404 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w" event={"ID":"b6715332-90cd-43ab-b914-58744152dfea","Type":"ContainerDied","Data":"73551c6770d9d9d134bd312cc69d9e99fa54a15813672f98c0099d1b6c56e196"} Mar 18 16:58:58.682464 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.682419 2576 scope.go:117] "RemoveContainer" containerID="0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f" Mar 18 16:58:58.690726 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.690704 2576 scope.go:117] "RemoveContainer" containerID="0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f" Mar 18 16:58:58.691031 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:58:58.691009 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f\": container with ID starting with 0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f not found: ID does not 
exist" containerID="0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f" Mar 18 16:58:58.691129 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.691038 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f"} err="failed to get container status \"0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f\": rpc error: code = NotFound desc = could not find container \"0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f\": container with ID starting with 0abfb0936f5080cad47005b8cdced835dd04259e56f7925571cdc9e9b6049f5f not found: ID does not exist" Mar 18 16:58:58.709438 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.709407 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w"] Mar 18 16:58:58.713612 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:58.713585 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-31778-predictor-86547b75b6-96h9w"] Mar 18 16:58:59.562437 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:58:59.562408 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6715332-90cd-43ab-b914-58744152dfea" path="/var/lib/kubelet/pods/b6715332-90cd-43ab-b914-58744152dfea/volumes" Mar 18 16:59:00.559324 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:00.559274 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:59:00.559748 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:00.559605 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" 
podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:59:02.696419 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:02.696388 2576 generic.go:358] "Generic (PLEG): container finished" podID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerID="7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f" exitCode=0 Mar 18 16:59:02.696873 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:02.696466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" event={"ID":"be43e454-6c78-4ba5-bddf-b3c5c0bb2626","Type":"ContainerDied","Data":"7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f"} Mar 18 16:59:02.697717 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:02.697698 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerID="2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0" exitCode=0 Mar 18 16:59:02.697815 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:02.697740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" event={"ID":"5b4856d3-fbfd-4a47-aff7-1c797aea2b16","Type":"ContainerDied","Data":"2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0"} Mar 18 16:59:03.704742 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:03.704707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" event={"ID":"5b4856d3-fbfd-4a47-aff7-1c797aea2b16","Type":"ContainerStarted","Data":"20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af"} Mar 18 16:59:03.705177 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:03.704973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 16:59:03.706491 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:03.706464 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Mar 18 16:59:03.722967 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:03.722908 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podStartSLOduration=6.722895204 podStartE2EDuration="6.722895204s" podCreationTimestamp="2026-03-18 16:58:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:59:03.72090929 +0000 UTC m=+934.770724717" watchObservedRunningTime="2026-03-18 16:59:03.722895204 +0000 UTC m=+934.772710636" Mar 18 16:59:04.708665 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:04.708624 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Mar 18 16:59:09.561672 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:59:09.561605 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access 
to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 16:59:10.559656 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:10.559607 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:59:10.559967 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:10.559927 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:59:14.709147 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:14.709100 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Mar 18 16:59:20.559376 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:20.559318 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Mar 18 16:59:20.559862 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:20.559474 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6"
Mar 18 16:59:20.559913 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:20.559857 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 18 16:59:20.559990 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:20.559975 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6"
Mar 18 16:59:21.559685 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:59:21.559656 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 16:59:24.709141 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:24.709093 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 16:59:27.676753 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.676729 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6"
Mar 18 16:59:27.744994 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.744960 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rzs6\" (UniqueName: \"kubernetes.io/projected/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kube-api-access-4rzs6\") pod \"be43e454-6c78-4ba5-bddf-b3c5c0bb2626\" (UID: \"be43e454-6c78-4ba5-bddf-b3c5c0bb2626\") "
Mar 18 16:59:27.745159 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.745016 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kserve-provision-location\") pod \"be43e454-6c78-4ba5-bddf-b3c5c0bb2626\" (UID: \"be43e454-6c78-4ba5-bddf-b3c5c0bb2626\") "
Mar 18 16:59:27.745364 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.745340 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "be43e454-6c78-4ba5-bddf-b3c5c0bb2626" (UID: "be43e454-6c78-4ba5-bddf-b3c5c0bb2626"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:59:27.747109 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.747081 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kube-api-access-4rzs6" (OuterVolumeSpecName: "kube-api-access-4rzs6") pod "be43e454-6c78-4ba5-bddf-b3c5c0bb2626" (UID: "be43e454-6c78-4ba5-bddf-b3c5c0bb2626"). InnerVolumeSpecName "kube-api-access-4rzs6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:59:27.778391 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.778361 2576 generic.go:358] "Generic (PLEG): container finished" podID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerID="d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2" exitCode=0
Mar 18 16:59:27.778543 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.778399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" event={"ID":"be43e454-6c78-4ba5-bddf-b3c5c0bb2626","Type":"ContainerDied","Data":"d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2"}
Mar 18 16:59:27.778543 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.778422 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6" event={"ID":"be43e454-6c78-4ba5-bddf-b3c5c0bb2626","Type":"ContainerDied","Data":"9ec01c7eced47bcbcb5518255c85ee8033266d80d9bf7c2a092aff74c410d9e8"}
Mar 18 16:59:27.778543 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.778437 2576 scope.go:117] "RemoveContainer" containerID="d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2"
Mar 18 16:59:27.778543 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.778443 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6"
Mar 18 16:59:27.792435 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.792415 2576 scope.go:117] "RemoveContainer" containerID="7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f"
Mar 18 16:59:27.799763 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.799746 2576 scope.go:117] "RemoveContainer" containerID="58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334"
Mar 18 16:59:27.803025 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.802999 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6"]
Mar 18 16:59:27.804604 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.804585 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-31778-predictor-d454ff497-sxzd6"]
Mar 18 16:59:27.812617 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.812599 2576 scope.go:117] "RemoveContainer" containerID="d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2"
Mar 18 16:59:27.812901 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:59:27.812880 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2\": container with ID starting with d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2 not found: ID does not exist" containerID="d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2"
Mar 18 16:59:27.812973 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.812910 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2"} err="failed to get container status \"d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2\": rpc error: code = NotFound desc = could not find container \"d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2\": container with ID starting with d01157130c0c5040c59b400d0f42beda4784d1e0f1c56c85548ff5b66f4012b2 not found: ID does not exist"
Mar 18 16:59:27.812973 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.812929 2576 scope.go:117] "RemoveContainer" containerID="7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f"
Mar 18 16:59:27.813235 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:59:27.813219 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f\": container with ID starting with 7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f not found: ID does not exist" containerID="7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f"
Mar 18 16:59:27.813297 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.813240 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f"} err="failed to get container status \"7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f\": rpc error: code = NotFound desc = could not find container \"7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f\": container with ID starting with 7bffd6983c662e014bd98232b1f067e0e95e744b5f2cbf664e7bc246a555c09f not found: ID does not exist"
Mar 18 16:59:27.813297 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.813256 2576 scope.go:117] "RemoveContainer" containerID="58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334"
Mar 18 16:59:27.813501 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:59:27.813484 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334\": container with ID starting with 58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334 not found: ID does not exist" containerID="58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334"
Mar 18 16:59:27.813538 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.813505 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334"} err="failed to get container status \"58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334\": rpc error: code = NotFound desc = could not find container \"58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334\": container with ID starting with 58018c862416dbf7ee5f6845092b99164d2cc0549ec2d092065fe763aed0f334 not found: ID does not exist"
Mar 18 16:59:27.846124 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.846090 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4rzs6\" (UniqueName: \"kubernetes.io/projected/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kube-api-access-4rzs6\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:59:27.846124 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:27.846123 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be43e454-6c78-4ba5-bddf-b3c5c0bb2626-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 16:59:29.562484 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:29.562459 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" path="/var/lib/kubelet/pods/be43e454-6c78-4ba5-bddf-b3c5c0bb2626/volumes"
Mar 18 16:59:34.709549 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:34.709508 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 16:59:35.560279 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:59:35.560247 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 16:59:44.709152 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:44.709106 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 16:59:50.559746 ip-10-0-133-190 kubenswrapper[2576]: E0318 16:59:50.559710 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 16:59:54.709000 ip-10-0-133-190 kubenswrapper[2576]: I0318 16:59:54.708935 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:00:03.560104 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:00:03.560050 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:00:04.708624 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:00:04.708585 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:00:14.709430 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:00:14.709386 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:00:15.559995 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:00:15.559961 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:00:22.558802 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:00:22.558745 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:00:27.560100 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:00:27.560068 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:00:32.559134 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:00:32.559086 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:00:42.559079 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:00:42.559026 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:00:42.560153 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:00:42.560122 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:00:52.559664 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:00:52.559621 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:00:56.560000 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:00:56.559933 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:01:02.559140 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:01:02.559092 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:01:07.560034 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:01:07.560005 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:01:12.559136 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:01:12.559090 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:01:19.561446 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:01:19.561341 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:01:22.559455 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:01:22.559411 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:01:29.542046 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:01:29.542001 2576 scope.go:117] "RemoveContainer" containerID="a2aa83a52cc790ab09190047403da5bb5731569568d605d6a4cf006249fc6dbf"
Mar 18 17:01:29.551547 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:01:29.551417 2576 scope.go:117] "RemoveContainer" containerID="ba9caa4bb4178b887cb83fbcd1ee644faf96f07188e415f856f65c0c4bb22b59"
Mar 18 17:01:32.559488 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:01:32.559440 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:01:33.846730 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:01:33.846626 2576 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast"
Mar 18 17:01:33.847175 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:01:33.846866 2576 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6p25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-jrcb2_kserve(c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 17:01:33.848127 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:01:33.848095 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:01:40.559516 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:01:40.559476 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:01:48.559845 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:01:48.559810 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:01:50.560114 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:01:50.560070 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:02:00.561141 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:00.561110 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4"
Mar 18 17:02:01.564088 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:02:01.564029 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:02:08.011753 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.011717 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4"]
Mar 18 17:02:08.012228 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.012016 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" containerID="cri-o://20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af" gracePeriod=30
Mar 18 17:02:08.178293 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178258 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"]
Mar 18 17:02:08.178567 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178555 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent"
Mar 18 17:02:08.178627 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178568 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent"
Mar 18 17:02:08.178627 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178576 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6715332-90cd-43ab-b914-58744152dfea" containerName="kserve-container"
Mar 18 17:02:08.178627 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178582 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6715332-90cd-43ab-b914-58744152dfea" containerName="kserve-container"
Mar 18 17:02:08.178627 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178591 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="storage-initializer"
Mar 18 17:02:08.178627 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178597 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="storage-initializer"
Mar 18 17:02:08.178627 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178605 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container"
Mar 18 17:02:08.178627 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178610 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container"
Mar 18 17:02:08.179012 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178653 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6715332-90cd-43ab-b914-58744152dfea" containerName="kserve-container"
Mar 18 17:02:08.179012 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178661 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="kserve-container"
Mar 18 17:02:08.179012 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.178668 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="be43e454-6c78-4ba5-bddf-b3c5c0bb2626" containerName="agent"
Mar 18 17:02:08.181863 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.181840 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"
Mar 18 17:02:08.190678 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.190647 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"]
Mar 18 17:02:08.277713 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.277620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47mqq\" (UniqueName: \"kubernetes.io/projected/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kube-api-access-47mqq\") pod \"isvc-primary-b99a18-predictor-5974475d47-hlswc\" (UID: \"fe86ed4c-c91f-487b-9530-8aa1a9c230f2\") " pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"
Mar 18 17:02:08.277713 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.277684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kserve-provision-location\") pod \"isvc-primary-b99a18-predictor-5974475d47-hlswc\" (UID: \"fe86ed4c-c91f-487b-9530-8aa1a9c230f2\") " pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"
Mar 18 17:02:08.378392 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.378357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47mqq\" (UniqueName: \"kubernetes.io/projected/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kube-api-access-47mqq\") pod \"isvc-primary-b99a18-predictor-5974475d47-hlswc\" (UID: \"fe86ed4c-c91f-487b-9530-8aa1a9c230f2\") " pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"
Mar 18 17:02:08.378571 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.378404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kserve-provision-location\") pod \"isvc-primary-b99a18-predictor-5974475d47-hlswc\" (UID: \"fe86ed4c-c91f-487b-9530-8aa1a9c230f2\") " pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"
Mar 18 17:02:08.378837 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.378815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kserve-provision-location\") pod \"isvc-primary-b99a18-predictor-5974475d47-hlswc\" (UID: \"fe86ed4c-c91f-487b-9530-8aa1a9c230f2\") " pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"
Mar 18 17:02:08.388611 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.388575 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47mqq\" (UniqueName: \"kubernetes.io/projected/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kube-api-access-47mqq\") pod \"isvc-primary-b99a18-predictor-5974475d47-hlswc\" (UID: \"fe86ed4c-c91f-487b-9530-8aa1a9c230f2\") " pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"
Mar 18 17:02:08.492577 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.492537 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"
Mar 18 17:02:08.610803 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:08.610776 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"]
Mar 18 17:02:08.613293 ip-10-0-133-190 kubenswrapper[2576]: W0318 17:02:08.613265 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe86ed4c_c91f_487b_9530_8aa1a9c230f2.slice/crio-a1cc12ce85c4b7191aeb2284b55af9f5be17ff6b0a85b99e094f9ee287a698f3 WatchSource:0}: Error finding container a1cc12ce85c4b7191aeb2284b55af9f5be17ff6b0a85b99e094f9ee287a698f3: Status 404 returned error can't find the container with id a1cc12ce85c4b7191aeb2284b55af9f5be17ff6b0a85b99e094f9ee287a698f3
Mar 18 17:02:09.285752 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:09.285714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" event={"ID":"fe86ed4c-c91f-487b-9530-8aa1a9c230f2","Type":"ContainerStarted","Data":"c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030"}
Mar 18 17:02:09.285752 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:09.285755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" event={"ID":"fe86ed4c-c91f-487b-9530-8aa1a9c230f2","Type":"ContainerStarted","Data":"a1cc12ce85c4b7191aeb2284b55af9f5be17ff6b0a85b99e094f9ee287a698f3"}
Mar 18 17:02:10.560247 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:10.560202 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Mar 18 17:02:13.303217
ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:13.303181 2576 generic.go:358] "Generic (PLEG): container finished" podID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerID="c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030" exitCode=0 Mar 18 17:02:13.303577 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:13.303244 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" event={"ID":"fe86ed4c-c91f-487b-9530-8aa1a9c230f2","Type":"ContainerDied","Data":"c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030"} Mar 18 17:02:14.308072 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:14.308041 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" event={"ID":"fe86ed4c-c91f-487b-9530-8aa1a9c230f2","Type":"ContainerStarted","Data":"7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398"} Mar 18 17:02:14.308480 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:14.308330 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" Mar 18 17:02:14.309724 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:14.309697 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Mar 18 17:02:14.325474 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:14.325430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podStartSLOduration=6.325416334 podStartE2EDuration="6.325416334s" podCreationTimestamp="2026-03-18 17:02:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:02:14.3236376 +0000 UTC m=+1125.373453026" watchObservedRunningTime="2026-03-18 17:02:14.325416334 +0000 UTC m=+1125.375231761" Mar 18 17:02:15.311771 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:15.311734 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Mar 18 17:02:15.560232 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:02:15.560203 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:02:18.453394 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:18.453365 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 17:02:18.559331 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:18.559226 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kserve-provision-location\") pod \"5b4856d3-fbfd-4a47-aff7-1c797aea2b16\" (UID: \"5b4856d3-fbfd-4a47-aff7-1c797aea2b16\") " Mar 18 17:02:18.559331 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:18.559292 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwpbm\" (UniqueName: \"kubernetes.io/projected/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kube-api-access-fwpbm\") pod \"5b4856d3-fbfd-4a47-aff7-1c797aea2b16\" (UID: \"5b4856d3-fbfd-4a47-aff7-1c797aea2b16\") " Mar 18 17:02:18.559591 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:18.559571 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b4856d3-fbfd-4a47-aff7-1c797aea2b16" (UID: "5b4856d3-fbfd-4a47-aff7-1c797aea2b16"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:02:18.561360 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:18.561340 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kube-api-access-fwpbm" (OuterVolumeSpecName: "kube-api-access-fwpbm") pod "5b4856d3-fbfd-4a47-aff7-1c797aea2b16" (UID: "5b4856d3-fbfd-4a47-aff7-1c797aea2b16"). InnerVolumeSpecName "kube-api-access-fwpbm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:02:18.660634 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:18.660583 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 17:02:18.660634 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:18.660627 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fwpbm\" (UniqueName: \"kubernetes.io/projected/5b4856d3-fbfd-4a47-aff7-1c797aea2b16-kube-api-access-fwpbm\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 17:02:19.329958 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.329905 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerID="20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af" exitCode=0 Mar 18 17:02:19.330116 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.329974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" event={"ID":"5b4856d3-fbfd-4a47-aff7-1c797aea2b16","Type":"ContainerDied","Data":"20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af"} Mar 18 17:02:19.330116 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.330017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" event={"ID":"5b4856d3-fbfd-4a47-aff7-1c797aea2b16","Type":"ContainerDied","Data":"0abbc9390f9fc6333b60a609d3fddd9188d49a7a7d287158f50445897694f7c5"} Mar 18 17:02:19.330116 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.330031 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4" Mar 18 17:02:19.330280 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.330036 2576 scope.go:117] "RemoveContainer" containerID="20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af" Mar 18 17:02:19.338465 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.338447 2576 scope.go:117] "RemoveContainer" containerID="2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0" Mar 18 17:02:19.347083 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.347066 2576 scope.go:117] "RemoveContainer" containerID="20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af" Mar 18 17:02:19.347459 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:02:19.347320 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af\": container with ID starting with 20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af not found: ID does not exist" containerID="20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af" Mar 18 17:02:19.347459 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.347345 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af"} err="failed to get container status \"20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af\": rpc error: code = NotFound desc = could not find container \"20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af\": container with ID starting with 20f464e71770ce2fc8b4b344d2e3b3452c0523ef8dc7eaafc6b0b48570dd07af not found: ID does not exist" Mar 18 17:02:19.347459 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.347371 2576 scope.go:117] "RemoveContainer" containerID="2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0" Mar 
18 17:02:19.347690 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:02:19.347658 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0\": container with ID starting with 2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0 not found: ID does not exist" containerID="2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0" Mar 18 17:02:19.347797 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.347698 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0"} err="failed to get container status \"2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0\": rpc error: code = NotFound desc = could not find container \"2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0\": container with ID starting with 2ff82353a116810c818128afdc108e5197c904cd21eceff76ec0a28130787ed0 not found: ID does not exist" Mar 18 17:02:19.349933 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.349904 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4"] Mar 18 17:02:19.351288 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.351269 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-342bc-predictor-54858cdc9d-lbqp4"] Mar 18 17:02:19.563282 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:19.563251 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" path="/var/lib/kubelet/pods/5b4856d3-fbfd-4a47-aff7-1c797aea2b16/volumes" Mar 18 17:02:25.312060 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:25.312020 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Mar 18 17:02:29.563930 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:02:29.563898 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:02:35.312760 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:35.312711 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Mar 18 17:02:44.560293 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:02:44.560258 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the 
requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:02:45.312379 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:45.312333 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Mar 18 17:02:55.312522 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:02:55.312426 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Mar 18 17:02:58.559469 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:02:58.559437 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:03:05.312726 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:05.312678 
2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Mar 18 17:03:10.560328 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:03:10.560292 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:03:15.312288 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:15.312240 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Mar 18 17:03:23.563513 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:03:23.563479 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:03:25.312289 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:25.312240 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Mar 18 17:03:25.563698 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:25.563610 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Mar 18 17:03:29.503128 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:29.503103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 17:03:29.508038 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:29.508009 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 17:03:35.563985 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:35.563919 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: 
connection refused" Mar 18 17:03:36.560214 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:03:36.560119 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:03:45.564601 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:45.564572 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" Mar 18 17:03:48.272839 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.272799 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c"] Mar 18 17:03:48.273259 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.273249 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="storage-initializer" Mar 18 17:03:48.273303 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.273264 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="storage-initializer" Mar 18 17:03:48.273303 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.273292 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" 
Mar 18 17:03:48.273303 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.273300 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" Mar 18 17:03:48.273393 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.273365 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b4856d3-fbfd-4a47-aff7-1c797aea2b16" containerName="kserve-container" Mar 18 17:03:48.275758 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.275736 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" Mar 18 17:03:48.277864 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.277836 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-b99a18\"" Mar 18 17:03:48.277864 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.277836 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-b99a18-dockercfg-mzztk\"" Mar 18 17:03:48.283370 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.283081 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c"] Mar 18 17:03:48.326731 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.326694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6789\" (UniqueName: \"kubernetes.io/projected/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kube-api-access-s6789\") pod \"isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c\" (UID: \"95e87798-dde3-44b3-82e3-2ec6fd2a7b68\") " pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" Mar 18 17:03:48.326929 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.326816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kserve-provision-location\") pod \"isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c\" (UID: \"95e87798-dde3-44b3-82e3-2ec6fd2a7b68\") " pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" Mar 18 17:03:48.428253 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.428212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kserve-provision-location\") pod \"isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c\" (UID: \"95e87798-dde3-44b3-82e3-2ec6fd2a7b68\") " pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" Mar 18 17:03:48.428427 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.428262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6789\" (UniqueName: \"kubernetes.io/projected/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kube-api-access-s6789\") pod \"isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c\" (UID: \"95e87798-dde3-44b3-82e3-2ec6fd2a7b68\") " pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" Mar 18 17:03:48.428602 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.428577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kserve-provision-location\") pod \"isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c\" (UID: \"95e87798-dde3-44b3-82e3-2ec6fd2a7b68\") " pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" Mar 18 17:03:48.435954 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.435911 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6789\" (UniqueName: 
\"kubernetes.io/projected/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kube-api-access-s6789\") pod \"isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c\" (UID: \"95e87798-dde3-44b3-82e3-2ec6fd2a7b68\") " pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" Mar 18 17:03:48.587774 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.587683 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" Mar 18 17:03:48.707910 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.707885 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c"] Mar 18 17:03:48.710523 ip-10-0-133-190 kubenswrapper[2576]: W0318 17:03:48.710497 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95e87798_dde3_44b3_82e3_2ec6fd2a7b68.slice/crio-fbd236371226c911b35b79585aad2f66790107612d4d80e0c36f8b06640a4907 WatchSource:0}: Error finding container fbd236371226c911b35b79585aad2f66790107612d4d80e0c36f8b06640a4907: Status 404 returned error can't find the container with id fbd236371226c911b35b79585aad2f66790107612d4d80e0c36f8b06640a4907 Mar 18 17:03:48.712717 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:48.712699 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:03:49.561596 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:03:49.561567 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource 
is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:03:49.606561 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:49.606529 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_95e87798-dde3-44b3-82e3-2ec6fd2a7b68/storage-initializer/0.log" Mar 18 17:03:49.606736 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:49.606569 2576 generic.go:358] "Generic (PLEG): container finished" podID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" containerID="c246a13045de24c78e387db7ef9d0ded0d272b1b78ecfe8048fce0800857de5f" exitCode=1 Mar 18 17:03:49.606736 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:49.606637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" event={"ID":"95e87798-dde3-44b3-82e3-2ec6fd2a7b68","Type":"ContainerDied","Data":"c246a13045de24c78e387db7ef9d0ded0d272b1b78ecfe8048fce0800857de5f"} Mar 18 17:03:49.606736 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:49.606662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" event={"ID":"95e87798-dde3-44b3-82e3-2ec6fd2a7b68","Type":"ContainerStarted","Data":"fbd236371226c911b35b79585aad2f66790107612d4d80e0c36f8b06640a4907"} Mar 18 17:03:50.611286 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:50.611262 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_95e87798-dde3-44b3-82e3-2ec6fd2a7b68/storage-initializer/1.log" Mar 18 17:03:50.611719 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:50.611597 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_95e87798-dde3-44b3-82e3-2ec6fd2a7b68/storage-initializer/0.log" Mar 18 17:03:50.611719 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:50.611628 2576 generic.go:358] "Generic (PLEG): container finished" podID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" containerID="e57788399aa30bf3c651660b489a4e4a49c75a7f9f4931efed57e82426140943" exitCode=1 Mar 18 17:03:50.611719 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:50.611678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" event={"ID":"95e87798-dde3-44b3-82e3-2ec6fd2a7b68","Type":"ContainerDied","Data":"e57788399aa30bf3c651660b489a4e4a49c75a7f9f4931efed57e82426140943"} Mar 18 17:03:50.611719 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:50.611704 2576 scope.go:117] "RemoveContainer" containerID="c246a13045de24c78e387db7ef9d0ded0d272b1b78ecfe8048fce0800857de5f" Mar 18 17:03:50.612013 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:50.611994 2576 scope.go:117] "RemoveContainer" containerID="c246a13045de24c78e387db7ef9d0ded0d272b1b78ecfe8048fce0800857de5f" Mar 18 17:03:50.625542 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:03:50.625507 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_kserve-ci-e2e-test_95e87798-dde3-44b3-82e3-2ec6fd2a7b68_0 in pod sandbox fbd236371226c911b35b79585aad2f66790107612d4d80e0c36f8b06640a4907 from index: no such id: 'c246a13045de24c78e387db7ef9d0ded0d272b1b78ecfe8048fce0800857de5f'" containerID="c246a13045de24c78e387db7ef9d0ded0d272b1b78ecfe8048fce0800857de5f" Mar 18 17:03:50.625606 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:03:50.625564 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown 
desc = failed to delete container k8s_storage-initializer_isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_kserve-ci-e2e-test_95e87798-dde3-44b3-82e3-2ec6fd2a7b68_0 in pod sandbox fbd236371226c911b35b79585aad2f66790107612d4d80e0c36f8b06640a4907 from index: no such id: 'c246a13045de24c78e387db7ef9d0ded0d272b1b78ecfe8048fce0800857de5f'; Skipping pod \"isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_kserve-ci-e2e-test(95e87798-dde3-44b3-82e3-2ec6fd2a7b68)\"" logger="UnhandledError" Mar 18 17:03:50.626893 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:03:50.626871 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_kserve-ci-e2e-test(95e87798-dde3-44b3-82e3-2ec6fd2a7b68)\"" pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" podUID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" Mar 18 17:03:51.616522 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:03:51.616497 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_95e87798-dde3-44b3-82e3-2ec6fd2a7b68/storage-initializer/1.log" Mar 18 17:03:51.617076 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:03:51.617045 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_kserve-ci-e2e-test(95e87798-dde3-44b3-82e3-2ec6fd2a7b68)\"" pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" podUID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" Mar 18 17:04:00.094374 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.094337 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c"] Mar 18 17:04:00.189025 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.188993 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"] Mar 18 17:04:00.189329 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.189285 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" containerID="cri-o://7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398" gracePeriod=30 Mar 18 17:04:00.248173 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.248152 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_95e87798-dde3-44b3-82e3-2ec6fd2a7b68/storage-initializer/1.log" Mar 18 17:04:00.248313 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.248210 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" Mar 18 17:04:00.323623 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.323594 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kserve-provision-location\") pod \"95e87798-dde3-44b3-82e3-2ec6fd2a7b68\" (UID: \"95e87798-dde3-44b3-82e3-2ec6fd2a7b68\") " Mar 18 17:04:00.323822 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.323653 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6789\" (UniqueName: \"kubernetes.io/projected/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kube-api-access-s6789\") pod \"95e87798-dde3-44b3-82e3-2ec6fd2a7b68\" (UID: \"95e87798-dde3-44b3-82e3-2ec6fd2a7b68\") " Mar 18 17:04:00.323887 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.323865 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "95e87798-dde3-44b3-82e3-2ec6fd2a7b68" (UID: "95e87798-dde3-44b3-82e3-2ec6fd2a7b68"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:04:00.325819 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.325788 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kube-api-access-s6789" (OuterVolumeSpecName: "kube-api-access-s6789") pod "95e87798-dde3-44b3-82e3-2ec6fd2a7b68" (UID: "95e87798-dde3-44b3-82e3-2ec6fd2a7b68"). InnerVolumeSpecName "kube-api-access-s6789". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:04:00.388592 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.388509 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx"] Mar 18 17:04:00.388869 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.388853 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" containerName="storage-initializer" Mar 18 17:04:00.388869 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.388870 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" containerName="storage-initializer" Mar 18 17:04:00.389033 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.388887 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" containerName="storage-initializer" Mar 18 17:04:00.389033 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.388892 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" containerName="storage-initializer" Mar 18 17:04:00.389033 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.388965 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" containerName="storage-initializer" Mar 18 17:04:00.389033 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.388973 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" containerName="storage-initializer" Mar 18 17:04:00.392845 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.392822 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" Mar 18 17:04:00.394694 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.394676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-4b816e-dockercfg-f67cn\"" Mar 18 17:04:00.394793 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.394676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-4b816e\"" Mar 18 17:04:00.398020 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.398000 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx"] Mar 18 17:04:00.424661 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.424629 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 17:04:00.424792 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.424665 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6789\" (UniqueName: \"kubernetes.io/projected/95e87798-dde3-44b3-82e3-2ec6fd2a7b68-kube-api-access-s6789\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 17:04:00.525494 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.525457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjf2\" (UniqueName: \"kubernetes.io/projected/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kube-api-access-pdjf2\") pod \"isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx\" (UID: \"74866ab0-1959-4da9-ab8b-fc07c1971d4d\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" Mar 18 17:04:00.525667 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.525518 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kserve-provision-location\") pod \"isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx\" (UID: \"74866ab0-1959-4da9-ab8b-fc07c1971d4d\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" Mar 18 17:04:00.625969 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.625921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdjf2\" (UniqueName: \"kubernetes.io/projected/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kube-api-access-pdjf2\") pod \"isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx\" (UID: \"74866ab0-1959-4da9-ab8b-fc07c1971d4d\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" Mar 18 17:04:00.626138 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.625991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kserve-provision-location\") pod \"isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx\" (UID: \"74866ab0-1959-4da9-ab8b-fc07c1971d4d\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" Mar 18 17:04:00.626348 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.626333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kserve-provision-location\") pod \"isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx\" (UID: \"74866ab0-1959-4da9-ab8b-fc07c1971d4d\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" Mar 18 17:04:00.633441 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.633415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pdjf2\" (UniqueName: \"kubernetes.io/projected/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kube-api-access-pdjf2\") pod \"isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx\" (UID: \"74866ab0-1959-4da9-ab8b-fc07c1971d4d\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" Mar 18 17:04:00.645450 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.645387 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c_95e87798-dde3-44b3-82e3-2ec6fd2a7b68/storage-initializer/1.log" Mar 18 17:04:00.645574 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.645454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" event={"ID":"95e87798-dde3-44b3-82e3-2ec6fd2a7b68","Type":"ContainerDied","Data":"fbd236371226c911b35b79585aad2f66790107612d4d80e0c36f8b06640a4907"} Mar 18 17:04:00.645574 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.645488 2576 scope.go:117] "RemoveContainer" containerID="e57788399aa30bf3c651660b489a4e4a49c75a7f9f4931efed57e82426140943" Mar 18 17:04:00.645574 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.645492 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c" Mar 18 17:04:00.681948 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.681902 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c"] Mar 18 17:04:00.685500 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.685469 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b99a18-predictor-5bc9b9f7d-qtt5c"] Mar 18 17:04:00.704345 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.704310 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" Mar 18 17:04:00.824900 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:00.824868 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx"] Mar 18 17:04:00.827771 ip-10-0-133-190 kubenswrapper[2576]: W0318 17:04:00.827743 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74866ab0_1959_4da9_ab8b_fc07c1971d4d.slice/crio-efbc3e3e0b0d123131fb4569ca4b6673e0d22487e870ae6f26b1dc500622c2b4 WatchSource:0}: Error finding container efbc3e3e0b0d123131fb4569ca4b6673e0d22487e870ae6f26b1dc500622c2b4: Status 404 returned error can't find the container with id efbc3e3e0b0d123131fb4569ca4b6673e0d22487e870ae6f26b1dc500622c2b4 Mar 18 17:04:01.562439 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:01.562403 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95e87798-dde3-44b3-82e3-2ec6fd2a7b68" path="/var/lib/kubelet/pods/95e87798-dde3-44b3-82e3-2ec6fd2a7b68/volumes" Mar 18 17:04:01.650581 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:01.650549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" event={"ID":"74866ab0-1959-4da9-ab8b-fc07c1971d4d","Type":"ContainerStarted","Data":"c955fdfbf6282e8b07df1b03f9276448e436c4496b214c666b2ed3908da647e2"} Mar 18 17:04:01.650581 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:01.650582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" event={"ID":"74866ab0-1959-4da9-ab8b-fc07c1971d4d","Type":"ContainerStarted","Data":"efbc3e3e0b0d123131fb4569ca4b6673e0d22487e870ae6f26b1dc500622c2b4"} Mar 18 17:04:02.559828 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:02.559794 2576 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:04:02.654218 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:02.654189 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_74866ab0-1959-4da9-ab8b-fc07c1971d4d/storage-initializer/0.log" Mar 18 17:04:02.654577 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:02.654231 2576 generic.go:358] "Generic (PLEG): container finished" podID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" containerID="c955fdfbf6282e8b07df1b03f9276448e436c4496b214c666b2ed3908da647e2" exitCode=1 Mar 18 17:04:02.654577 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:02.654285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" event={"ID":"74866ab0-1959-4da9-ab8b-fc07c1971d4d","Type":"ContainerDied","Data":"c955fdfbf6282e8b07df1b03f9276448e436c4496b214c666b2ed3908da647e2"} Mar 18 17:04:03.658773 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:03.658738 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_74866ab0-1959-4da9-ab8b-fc07c1971d4d/storage-initializer/1.log" Mar 18 17:04:03.659197 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:03.659125 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_74866ab0-1959-4da9-ab8b-fc07c1971d4d/storage-initializer/0.log" Mar 18 17:04:03.659197 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:03.659163 2576 generic.go:358] "Generic (PLEG): container finished" podID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" containerID="72633d5e7c9053ef715328875118d944d91e4c72a7d0eb8bc594538956691de9" exitCode=1 Mar 18 17:04:03.659374 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:03.659196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" event={"ID":"74866ab0-1959-4da9-ab8b-fc07c1971d4d","Type":"ContainerDied","Data":"72633d5e7c9053ef715328875118d944d91e4c72a7d0eb8bc594538956691de9"} Mar 18 17:04:03.659374 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:03.659244 2576 scope.go:117] "RemoveContainer" containerID="c955fdfbf6282e8b07df1b03f9276448e436c4496b214c666b2ed3908da647e2" Mar 18 17:04:03.659550 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:03.659533 2576 scope.go:117] "RemoveContainer" containerID="c955fdfbf6282e8b07df1b03f9276448e436c4496b214c666b2ed3908da647e2" Mar 18 17:04:03.673902 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:03.673871 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_kserve-ci-e2e-test_74866ab0-1959-4da9-ab8b-fc07c1971d4d_0 in pod sandbox efbc3e3e0b0d123131fb4569ca4b6673e0d22487e870ae6f26b1dc500622c2b4 from index: no such id: 'c955fdfbf6282e8b07df1b03f9276448e436c4496b214c666b2ed3908da647e2'" containerID="c955fdfbf6282e8b07df1b03f9276448e436c4496b214c666b2ed3908da647e2" Mar 18 17:04:03.673977 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:03.673923 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": 
rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_kserve-ci-e2e-test_74866ab0-1959-4da9-ab8b-fc07c1971d4d_0 in pod sandbox efbc3e3e0b0d123131fb4569ca4b6673e0d22487e870ae6f26b1dc500622c2b4 from index: no such id: 'c955fdfbf6282e8b07df1b03f9276448e436c4496b214c666b2ed3908da647e2'; Skipping pod \"isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_kserve-ci-e2e-test(74866ab0-1959-4da9-ab8b-fc07c1971d4d)\"" logger="UnhandledError" Mar 18 17:04:03.675369 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:03.675341 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_kserve-ci-e2e-test(74866ab0-1959-4da9-ab8b-fc07c1971d4d)\"" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" podUID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" Mar 18 17:04:04.663216 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:04.663186 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_74866ab0-1959-4da9-ab8b-fc07c1971d4d/storage-initializer/1.log" Mar 18 17:04:04.663697 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:04.663674 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_kserve-ci-e2e-test(74866ab0-1959-4da9-ab8b-fc07c1971d4d)\"" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" podUID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" Mar 18 17:04:05.132193 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.132164 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" Mar 18 17:04:05.175982 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.175949 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx"] Mar 18 17:04:05.264483 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.264394 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kserve-provision-location\") pod \"fe86ed4c-c91f-487b-9530-8aa1a9c230f2\" (UID: \"fe86ed4c-c91f-487b-9530-8aa1a9c230f2\") " Mar 18 17:04:05.264483 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.264483 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47mqq\" (UniqueName: \"kubernetes.io/projected/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kube-api-access-47mqq\") pod \"fe86ed4c-c91f-487b-9530-8aa1a9c230f2\" (UID: \"fe86ed4c-c91f-487b-9530-8aa1a9c230f2\") " Mar 18 17:04:05.264796 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.264768 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fe86ed4c-c91f-487b-9530-8aa1a9c230f2" (UID: "fe86ed4c-c91f-487b-9530-8aa1a9c230f2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:04:05.266515 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.266490 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kube-api-access-47mqq" (OuterVolumeSpecName: "kube-api-access-47mqq") pod "fe86ed4c-c91f-487b-9530-8aa1a9c230f2" (UID: "fe86ed4c-c91f-487b-9530-8aa1a9c230f2"). 
InnerVolumeSpecName "kube-api-access-47mqq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:04:05.365234 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.365199 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 17:04:05.365234 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.365228 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47mqq\" (UniqueName: \"kubernetes.io/projected/fe86ed4c-c91f-487b-9530-8aa1a9c230f2-kube-api-access-47mqq\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 17:04:05.397402 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.397363 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9"] Mar 18 17:04:05.397752 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.397726 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="storage-initializer" Mar 18 17:04:05.397752 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.397748 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="storage-initializer" Mar 18 17:04:05.397900 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.397775 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" Mar 18 17:04:05.397900 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.397784 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" Mar 18 17:04:05.397900 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.397877 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerName="kserve-container" Mar 18 17:04:05.400568 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.400546 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 17:04:05.408600 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.408560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9"] Mar 18 17:04:05.567126 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.567033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ndcq\" (UniqueName: \"kubernetes.io/projected/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kube-api-access-5ndcq\") pod \"raw-sklearn-4bf7c-predictor-5854b44986-vgfk9\" (UID: \"9a3e6f2c-de86-4352-ad7c-876ec6601b25\") " pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 17:04:05.567266 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.567125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kserve-provision-location\") pod \"raw-sklearn-4bf7c-predictor-5854b44986-vgfk9\" (UID: \"9a3e6f2c-de86-4352-ad7c-876ec6601b25\") " pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 17:04:05.667529 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.667495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kserve-provision-location\") pod \"raw-sklearn-4bf7c-predictor-5854b44986-vgfk9\" (UID: \"9a3e6f2c-de86-4352-ad7c-876ec6601b25\") " pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 
17:04:05.667987 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.667553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ndcq\" (UniqueName: \"kubernetes.io/projected/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kube-api-access-5ndcq\") pod \"raw-sklearn-4bf7c-predictor-5854b44986-vgfk9\" (UID: \"9a3e6f2c-de86-4352-ad7c-876ec6601b25\") " pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 17:04:05.667987 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.667762 2576 generic.go:358] "Generic (PLEG): container finished" podID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" containerID="7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398" exitCode=0 Mar 18 17:04:05.667987 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.667834 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" Mar 18 17:04:05.667987 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.667845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" event={"ID":"fe86ed4c-c91f-487b-9530-8aa1a9c230f2","Type":"ContainerDied","Data":"7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398"} Mar 18 17:04:05.667987 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.667883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kserve-provision-location\") pod \"raw-sklearn-4bf7c-predictor-5854b44986-vgfk9\" (UID: \"9a3e6f2c-de86-4352-ad7c-876ec6601b25\") " pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 17:04:05.667987 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.667900 2576 scope.go:117] "RemoveContainer" containerID="7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398" Mar 18 
17:04:05.667987 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.667887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc" event={"ID":"fe86ed4c-c91f-487b-9530-8aa1a9c230f2","Type":"ContainerDied","Data":"a1cc12ce85c4b7191aeb2284b55af9f5be17ff6b0a85b99e094f9ee287a698f3"} Mar 18 17:04:05.675265 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.675233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ndcq\" (UniqueName: \"kubernetes.io/projected/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kube-api-access-5ndcq\") pod \"raw-sklearn-4bf7c-predictor-5854b44986-vgfk9\" (UID: \"9a3e6f2c-de86-4352-ad7c-876ec6601b25\") " pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 17:04:05.676580 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.676560 2576 scope.go:117] "RemoveContainer" containerID="c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030" Mar 18 17:04:05.682277 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.682252 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"] Mar 18 17:04:05.686161 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.686137 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b99a18-predictor-5974475d47-hlswc"] Mar 18 17:04:05.688809 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.688790 2576 scope.go:117] "RemoveContainer" containerID="7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398" Mar 18 17:04:05.689169 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:05.689142 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398\": container with ID starting with 7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398 
not found: ID does not exist" containerID="7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398" Mar 18 17:04:05.689261 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.689178 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398"} err="failed to get container status \"7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398\": rpc error: code = NotFound desc = could not find container \"7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398\": container with ID starting with 7e4051dc025f10dd48eec2f3360b69952cfcab7350af0abc5d2d15ed1503b398 not found: ID does not exist" Mar 18 17:04:05.689261 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.689206 2576 scope.go:117] "RemoveContainer" containerID="c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030" Mar 18 17:04:05.689481 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:05.689461 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030\": container with ID starting with c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030 not found: ID does not exist" containerID="c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030" Mar 18 17:04:05.689543 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.689486 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030"} err="failed to get container status \"c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030\": rpc error: code = NotFound desc = could not find container \"c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030\": container with ID starting with c7825f427708007ae92df5c9665d785797356df205e3a3d5a7617e0fd4da8030 not found: ID 
does not exist" Mar 18 17:04:05.712074 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.712043 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 17:04:05.786809 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.786785 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_74866ab0-1959-4da9-ab8b-fc07c1971d4d/storage-initializer/1.log" Mar 18 17:04:05.786974 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.786855 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" Mar 18 17:04:05.846151 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.846085 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9"] Mar 18 17:04:05.848302 ip-10-0-133-190 kubenswrapper[2576]: W0318 17:04:05.848272 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a3e6f2c_de86_4352_ad7c_876ec6601b25.slice/crio-bb73c0463c7093164f1feeb533349517f43e2d19231acf6d04a9308afc98b8c8 WatchSource:0}: Error finding container bb73c0463c7093164f1feeb533349517f43e2d19231acf6d04a9308afc98b8c8: Status 404 returned error can't find the container with id bb73c0463c7093164f1feeb533349517f43e2d19231acf6d04a9308afc98b8c8 Mar 18 17:04:05.869117 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.869094 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdjf2\" (UniqueName: \"kubernetes.io/projected/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kube-api-access-pdjf2\") pod \"74866ab0-1959-4da9-ab8b-fc07c1971d4d\" (UID: \"74866ab0-1959-4da9-ab8b-fc07c1971d4d\") " Mar 18 17:04:05.869217 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.869169 2576 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kserve-provision-location\") pod \"74866ab0-1959-4da9-ab8b-fc07c1971d4d\" (UID: \"74866ab0-1959-4da9-ab8b-fc07c1971d4d\") " Mar 18 17:04:05.869431 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.869401 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "74866ab0-1959-4da9-ab8b-fc07c1971d4d" (UID: "74866ab0-1959-4da9-ab8b-fc07c1971d4d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:04:05.871137 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.871117 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kube-api-access-pdjf2" (OuterVolumeSpecName: "kube-api-access-pdjf2") pod "74866ab0-1959-4da9-ab8b-fc07c1971d4d" (UID: "74866ab0-1959-4da9-ab8b-fc07c1971d4d"). InnerVolumeSpecName "kube-api-access-pdjf2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:04:05.970439 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.970405 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pdjf2\" (UniqueName: \"kubernetes.io/projected/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kube-api-access-pdjf2\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 17:04:05.970439 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:05.970434 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74866ab0-1959-4da9-ab8b-fc07c1971d4d-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 17:04:06.672689 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:06.672659 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx_74866ab0-1959-4da9-ab8b-fc07c1971d4d/storage-initializer/1.log" Mar 18 17:04:06.673183 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:06.672751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" event={"ID":"74866ab0-1959-4da9-ab8b-fc07c1971d4d","Type":"ContainerDied","Data":"efbc3e3e0b0d123131fb4569ca4b6673e0d22487e870ae6f26b1dc500622c2b4"} Mar 18 17:04:06.673183 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:06.672766 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx" Mar 18 17:04:06.673183 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:06.672779 2576 scope.go:117] "RemoveContainer" containerID="72633d5e7c9053ef715328875118d944d91e4c72a7d0eb8bc594538956691de9" Mar 18 17:04:06.674098 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:06.674059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" event={"ID":"9a3e6f2c-de86-4352-ad7c-876ec6601b25","Type":"ContainerStarted","Data":"a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581"} Mar 18 17:04:06.674098 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:06.674093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" event={"ID":"9a3e6f2c-de86-4352-ad7c-876ec6601b25","Type":"ContainerStarted","Data":"bb73c0463c7093164f1feeb533349517f43e2d19231acf6d04a9308afc98b8c8"} Mar 18 17:04:06.723515 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:06.719997 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx"] Mar 18 17:04:06.726076 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:06.726048 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b816e-predictor-9b5d65bdb-5sppx"] Mar 18 17:04:07.563100 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:07.563066 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" path="/var/lib/kubelet/pods/74866ab0-1959-4da9-ab8b-fc07c1971d4d/volumes" Mar 18 17:04:07.563429 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:07.563416 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe86ed4c-c91f-487b-9530-8aa1a9c230f2" path="/var/lib/kubelet/pods/fe86ed4c-c91f-487b-9530-8aa1a9c230f2/volumes" Mar 18 17:04:09.684185 
ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:09.684148 2576 generic.go:358] "Generic (PLEG): container finished" podID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerID="a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581" exitCode=0 Mar 18 17:04:09.684535 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:09.684222 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" event={"ID":"9a3e6f2c-de86-4352-ad7c-876ec6601b25","Type":"ContainerDied","Data":"a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581"} Mar 18 17:04:10.688323 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:10.688294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" event={"ID":"9a3e6f2c-de86-4352-ad7c-876ec6601b25","Type":"ContainerStarted","Data":"00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8"} Mar 18 17:04:10.688701 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:10.688567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 17:04:10.689950 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:10.689908 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:04:10.704650 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:10.704600 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podStartSLOduration=5.7045854049999996 podStartE2EDuration="5.704585405s" podCreationTimestamp="2026-03-18 17:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:04:10.703069928 +0000 UTC m=+1241.752885355" watchObservedRunningTime="2026-03-18 17:04:10.704585405 +0000 UTC m=+1241.754400835" Mar 18 17:04:11.691329 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:11.691294 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:04:15.560180 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:15.560140 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:04:21.691691 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:21.691592 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:04:30.560082 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:30.560041 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:04:31.692128 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:31.692079 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:04:41.559807 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:41.559764 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:04:41.691934 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:41.691885 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:04:51.692201 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:04:51.692157 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:04:55.560518 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:04:55.560476 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:05:01.691652 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:01.691606 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:05:09.563342 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:05:09.563156 2576 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:05:11.691875 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:11.691830 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:05:13.563673 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:13.563638 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:05:20.559988 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:05:20.559928 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested 
resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:05:23.564026 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:23.563989 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Mar 18 17:05:31.560555 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:05:31.560526 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:05:33.564242 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:33.564212 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 17:05:35.465001 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.464962 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9"] Mar 18 17:05:35.465473 ip-10-0-133-190 
kubenswrapper[2576]: I0318 17:05:35.465313 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container" containerID="cri-o://00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8" gracePeriod=30 Mar 18 17:05:35.676345 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.676314 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8"] Mar 18 17:05:35.676756 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.676743 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" containerName="storage-initializer" Mar 18 17:05:35.676806 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.676760 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" containerName="storage-initializer" Mar 18 17:05:35.676806 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.676777 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" containerName="storage-initializer" Mar 18 17:05:35.676806 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.676786 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" containerName="storage-initializer" Mar 18 17:05:35.676902 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.676873 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" containerName="storage-initializer" Mar 18 17:05:35.677060 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.677043 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="74866ab0-1959-4da9-ab8b-fc07c1971d4d" containerName="storage-initializer" Mar 18 17:05:35.681065 ip-10-0-133-190 
kubenswrapper[2576]: I0318 17:05:35.681040 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" Mar 18 17:05:35.688699 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.688673 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8"] Mar 18 17:05:35.771892 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.771795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbcd6\" (UniqueName: \"kubernetes.io/projected/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kube-api-access-nbcd6\") pod \"raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8\" (UID: \"bcac3e88-2eeb-4948-be6c-dbb5bd19da51\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" Mar 18 17:05:35.771892 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.771845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kserve-provision-location\") pod \"raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8\" (UID: \"bcac3e88-2eeb-4948-be6c-dbb5bd19da51\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" Mar 18 17:05:35.872275 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.872240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kserve-provision-location\") pod \"raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8\" (UID: \"bcac3e88-2eeb-4948-be6c-dbb5bd19da51\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" Mar 18 17:05:35.872464 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.872328 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbcd6\" (UniqueName: \"kubernetes.io/projected/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kube-api-access-nbcd6\") pod \"raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8\" (UID: \"bcac3e88-2eeb-4948-be6c-dbb5bd19da51\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" Mar 18 17:05:35.872712 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.872689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kserve-provision-location\") pod \"raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8\" (UID: \"bcac3e88-2eeb-4948-be6c-dbb5bd19da51\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" Mar 18 17:05:35.882484 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.882457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbcd6\" (UniqueName: \"kubernetes.io/projected/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kube-api-access-nbcd6\") pod \"raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8\" (UID: \"bcac3e88-2eeb-4948-be6c-dbb5bd19da51\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" Mar 18 17:05:35.992324 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:35.992286 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" Mar 18 17:05:36.115710 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:36.115677 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8"] Mar 18 17:05:36.118756 ip-10-0-133-190 kubenswrapper[2576]: W0318 17:05:36.118728 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcac3e88_2eeb_4948_be6c_dbb5bd19da51.slice/crio-1e682fb086cbf0c74cfc4a2f51fcaa82d72493666a1bc59072f17c117a9bfba2 WatchSource:0}: Error finding container 1e682fb086cbf0c74cfc4a2f51fcaa82d72493666a1bc59072f17c117a9bfba2: Status 404 returned error can't find the container with id 1e682fb086cbf0c74cfc4a2f51fcaa82d72493666a1bc59072f17c117a9bfba2 Mar 18 17:05:36.958474 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:36.958435 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" event={"ID":"bcac3e88-2eeb-4948-be6c-dbb5bd19da51","Type":"ContainerStarted","Data":"486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3"} Mar 18 17:05:36.958474 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:36.958479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" event={"ID":"bcac3e88-2eeb-4948-be6c-dbb5bd19da51","Type":"ContainerStarted","Data":"1e682fb086cbf0c74cfc4a2f51fcaa82d72493666a1bc59072f17c117a9bfba2"} Mar 18 17:05:39.970047 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:39.970009 2576 generic.go:358] "Generic (PLEG): container finished" podID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerID="486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3" exitCode=0 Mar 18 17:05:39.970493 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:39.970074 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" event={"ID":"bcac3e88-2eeb-4948-be6c-dbb5bd19da51","Type":"ContainerDied","Data":"486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3"} Mar 18 17:05:40.506038 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.506011 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" Mar 18 17:05:40.609201 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.609152 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ndcq\" (UniqueName: \"kubernetes.io/projected/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kube-api-access-5ndcq\") pod \"9a3e6f2c-de86-4352-ad7c-876ec6601b25\" (UID: \"9a3e6f2c-de86-4352-ad7c-876ec6601b25\") " Mar 18 17:05:40.609201 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.609205 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kserve-provision-location\") pod \"9a3e6f2c-de86-4352-ad7c-876ec6601b25\" (UID: \"9a3e6f2c-de86-4352-ad7c-876ec6601b25\") " Mar 18 17:05:40.609607 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.609578 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9a3e6f2c-de86-4352-ad7c-876ec6601b25" (UID: "9a3e6f2c-de86-4352-ad7c-876ec6601b25"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 17:05:40.611329 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.611308 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kube-api-access-5ndcq" (OuterVolumeSpecName: "kube-api-access-5ndcq") pod "9a3e6f2c-de86-4352-ad7c-876ec6601b25" (UID: "9a3e6f2c-de86-4352-ad7c-876ec6601b25"). InnerVolumeSpecName "kube-api-access-5ndcq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:05:40.710677 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.710592 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5ndcq\" (UniqueName: \"kubernetes.io/projected/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kube-api-access-5ndcq\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 17:05:40.710677 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.710618 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6f2c-de86-4352-ad7c-876ec6601b25-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\"" Mar 18 17:05:40.974768 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.974734 2576 generic.go:358] "Generic (PLEG): container finished" podID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerID="00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8" exitCode=0 Mar 18 17:05:40.975254 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.974806 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9"
Mar 18 17:05:40.975254 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.974835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" event={"ID":"9a3e6f2c-de86-4352-ad7c-876ec6601b25","Type":"ContainerDied","Data":"00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8"}
Mar 18 17:05:40.975254 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.974878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9" event={"ID":"9a3e6f2c-de86-4352-ad7c-876ec6601b25","Type":"ContainerDied","Data":"bb73c0463c7093164f1feeb533349517f43e2d19231acf6d04a9308afc98b8c8"}
Mar 18 17:05:40.975254 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.974900 2576 scope.go:117] "RemoveContainer" containerID="00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8"
Mar 18 17:05:40.976432 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.976409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" event={"ID":"bcac3e88-2eeb-4948-be6c-dbb5bd19da51","Type":"ContainerStarted","Data":"ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f"}
Mar 18 17:05:40.976707 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.976692 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8"
Mar 18 17:05:40.978263 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.978233 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Mar 18 17:05:40.982850 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.982833 2576 scope.go:117] "RemoveContainer" containerID="a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581"
Mar 18 17:05:40.991752 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.991730 2576 scope.go:117] "RemoveContainer" containerID="00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8"
Mar 18 17:05:40.992119 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:05:40.992086 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8\": container with ID starting with 00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8 not found: ID does not exist" containerID="00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8"
Mar 18 17:05:40.992228 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.992126 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8"} err="failed to get container status \"00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8\": rpc error: code = NotFound desc = could not find container \"00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8\": container with ID starting with 00a53613c735282f78b018c1047bd85ec9c24315841e02748eedfe9de76219d8 not found: ID does not exist"
Mar 18 17:05:40.992228 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.992192 2576 scope.go:117] "RemoveContainer" containerID="a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581"
Mar 18 17:05:40.993003 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:05:40.992966 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581\": container with ID starting with a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581 not found: ID does not exist" containerID="a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581"
Mar 18 17:05:40.993107 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.993007 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581"} err="failed to get container status \"a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581\": rpc error: code = NotFound desc = could not find container \"a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581\": container with ID starting with a3060fb11ebd1813a8cc1e04ddcdb2719af389301090d5348c6d49ed3427c581 not found: ID does not exist"
Mar 18 17:05:40.994313 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:40.994276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podStartSLOduration=5.9942634980000005 podStartE2EDuration="5.994263498s" podCreationTimestamp="2026-03-18 17:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:40.992139141 +0000 UTC m=+1332.041954570" watchObservedRunningTime="2026-03-18 17:05:40.994263498 +0000 UTC m=+1332.044078942"
Mar 18 17:05:41.003304 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:41.003278 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9"]
Mar 18 17:05:41.004608 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:41.004587 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-4bf7c-predictor-5854b44986-vgfk9"]
Mar 18 17:05:41.564354 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:41.564324 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" path="/var/lib/kubelet/pods/9a3e6f2c-de86-4352-ad7c-876ec6601b25/volumes"
Mar 18 17:05:41.980616 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:41.980582 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Mar 18 17:05:43.559879 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:05:43.559726 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:05:51.981592 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:05:51.981496 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Mar 18 17:05:54.560096 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:05:54.560055 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:06:01.980898 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:06:01.980857 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Mar 18 17:06:07.560186 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:06:07.560156 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:06:11.980544 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:06:11.980504 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Mar 18 17:06:21.560273 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:06:21.560158 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:06:21.981342 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:06:21.981295 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Mar 18 17:06:31.980876 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:06:31.980831 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Mar 18 17:06:32.560278 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:06:32.560245 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:06:41.981588 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:06:41.981541 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Mar 18 17:06:45.559711 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:06:45.559671 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Mar 18 17:06:46.815241 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:06:46.815129 2576 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" image="quay.io/opendatahub/odh-model-serving-api:fast"
Mar 18 17:06:46.815586 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:06:46.815314 2576 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:server,Image:quay.io/opendatahub/odh-model-serving-api:fast,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:TLS_CERT_DIR,Value:/tls,ValueFrom:nil,},EnvVar{Name:GATEWAY_LABEL_SELECTOR,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-certs,ReadOnly:true,MountPath:/tls,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6p25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000640000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod model-serving-api-9699c8d45-jrcb2_kserve(c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized" logger="UnhandledError"
Mar 18 17:06:46.816482 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:06:46.816457 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:06:55.559706 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:06:55.559662 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused"
Mar 18 17:06:59.564305 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:06:59.564077 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:07:05.563266 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:05.563227 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8"
Mar 18 17:07:14.560079 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:07:14.560035 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:07:15.877728 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:15.877646 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8"]
Mar 18 17:07:15.878129 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:15.877899 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container" containerID="cri-o://ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f" gracePeriod=30
Mar 18 17:07:20.938452 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:20.938424 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8"
Mar 18 17:07:20.991947 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:20.991919 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbcd6\" (UniqueName: \"kubernetes.io/projected/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kube-api-access-nbcd6\") pod \"bcac3e88-2eeb-4948-be6c-dbb5bd19da51\" (UID: \"bcac3e88-2eeb-4948-be6c-dbb5bd19da51\") "
Mar 18 17:07:20.992118 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:20.991967 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kserve-provision-location\") pod \"bcac3e88-2eeb-4948-be6c-dbb5bd19da51\" (UID: \"bcac3e88-2eeb-4948-be6c-dbb5bd19da51\") "
Mar 18 17:07:20.992333 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:20.992312 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bcac3e88-2eeb-4948-be6c-dbb5bd19da51" (UID: "bcac3e88-2eeb-4948-be6c-dbb5bd19da51"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 17:07:20.994000 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:20.993969 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kube-api-access-nbcd6" (OuterVolumeSpecName: "kube-api-access-nbcd6") pod "bcac3e88-2eeb-4948-be6c-dbb5bd19da51" (UID: "bcac3e88-2eeb-4948-be6c-dbb5bd19da51"). InnerVolumeSpecName "kube-api-access-nbcd6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:07:21.093474 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.093382 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nbcd6\" (UniqueName: \"kubernetes.io/projected/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kube-api-access-nbcd6\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 17:07:21.093474 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.093416 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bcac3e88-2eeb-4948-be6c-dbb5bd19da51-kserve-provision-location\") on node \"ip-10-0-133-190.ec2.internal\" DevicePath \"\""
Mar 18 17:07:21.287957 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.287913 2576 generic.go:358] "Generic (PLEG): container finished" podID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerID="ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f" exitCode=0
Mar 18 17:07:21.288115 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.287981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" event={"ID":"bcac3e88-2eeb-4948-be6c-dbb5bd19da51","Type":"ContainerDied","Data":"ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f"}
Mar 18 17:07:21.288115 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.288000 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8"
Mar 18 17:07:21.288115 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.288024 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8" event={"ID":"bcac3e88-2eeb-4948-be6c-dbb5bd19da51","Type":"ContainerDied","Data":"1e682fb086cbf0c74cfc4a2f51fcaa82d72493666a1bc59072f17c117a9bfba2"}
Mar 18 17:07:21.288115 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.288040 2576 scope.go:117] "RemoveContainer" containerID="ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f"
Mar 18 17:07:21.296681 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.296666 2576 scope.go:117] "RemoveContainer" containerID="486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3"
Mar 18 17:07:21.305457 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.305430 2576 scope.go:117] "RemoveContainer" containerID="ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f"
Mar 18 17:07:21.305704 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:07:21.305685 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f\": container with ID starting with ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f not found: ID does not exist" containerID="ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f"
Mar 18 17:07:21.305880 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.305712 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f"} err="failed to get container status \"ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f\": rpc error: code = NotFound desc = could not find container \"ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f\": container with ID starting with ffc6d4f3547d3ea4818dd3d9ae2598a8ee4ff9a21001872b37af45a0c1110b5f not found: ID does not exist"
Mar 18 17:07:21.305880 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.305729 2576 scope.go:117] "RemoveContainer" containerID="486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3"
Mar 18 17:07:21.306063 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:07:21.305914 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3\": container with ID starting with 486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3 not found: ID does not exist" containerID="486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3"
Mar 18 17:07:21.306063 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.305932 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3"} err="failed to get container status \"486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3\": rpc error: code = NotFound desc = could not find container \"486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3\": container with ID starting with 486c6f14ea3b035487d2d26df9ba61b585f7182d1922bdd157ef35f6a7a47bf3 not found: ID does not exist"
Mar 18 17:07:21.308634 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.308612 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8"]
Mar 18 17:07:21.311817 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.311794 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-8eb0c-predictor-58f7bfb56d-rpps8"]
Mar 18 17:07:21.562554 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:21.562524 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" path="/var/lib/kubelet/pods/bcac3e88-2eeb-4948-be6c-dbb5bd19da51/volumes"
Mar 18 17:07:26.559592 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:07:26.559558 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:07:37.560522 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:07:37.560406 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:07:44.045553 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:44.045520 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6qxc8_d7144ee2-f2d9-497d-acc9-a5a9a6afcf2b/global-pull-secret-syncer/0.log"
Mar 18 17:07:44.239753 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:44.239718 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vclgl_bf84e45b-5912-4a57-b350-dd211f3513fe/konnectivity-agent/0.log"
Mar 18 17:07:44.281370 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:44.281338 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-190.ec2.internal_1fbacc3938de459504eb6ed80d05d8dc/haproxy/0.log"
Mar 18 17:07:48.344193 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:48.344161 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ctmdb_b33135c9-e8d8-4c75-9332-67bbe7bfcaab/node-exporter/0.log"
Mar 18 17:07:48.364055 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:48.364023 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ctmdb_b33135c9-e8d8-4c75-9332-67bbe7bfcaab/kube-rbac-proxy/0.log"
Mar 18 17:07:48.389095 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:48.389070 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ctmdb_b33135c9-e8d8-4c75-9332-67bbe7bfcaab/init-textfile/0.log"
Mar 18 17:07:48.560025 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:07:48.559991 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3"
Mar 18 17:07:48.643209 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:48.643115 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-g4hrz_01085e71-bd91-4d49-a24f-b59a6e29f008/prometheus-operator/0.log"
Mar 18 17:07:48.662141 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:48.662114 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-g4hrz_01085e71-bd91-4d49-a24f-b59a6e29f008/kube-rbac-proxy/0.log"
Mar 18 17:07:48.690343 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:48.690315 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-8444df798b-59lfw_78f9ac4f-10cd-438c-8e38-5a772fb6f4e1/prometheus-operator-admission-webhook/0.log"
Mar 18 17:07:50.944071 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:50.944039 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-5b85974fd6-4k2gx_b7fbe299-6374-4715-a716-d47ac426d698/download-server/0.log"
Mar 18 17:07:51.342740 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.342710 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29"]
Mar 18 17:07:51.343038 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.343025 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="storage-initializer"
Mar 18 17:07:51.343091 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.343040 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="storage-initializer"
Mar 18 17:07:51.343091 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.343057 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container"
Mar 18 17:07:51.343091 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.343063 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container"
Mar 18 17:07:51.343091 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.343071 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="storage-initializer"
Mar 18 17:07:51.343091 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.343078 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="storage-initializer"
Mar 18 17:07:51.343091 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.343084 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container"
Mar 18 17:07:51.343091 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.343090 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container"
Mar 18 17:07:51.343345 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.343151 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a3e6f2c-de86-4352-ad7c-876ec6601b25" containerName="kserve-container"
Mar 18 17:07:51.343345 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.343162 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bcac3e88-2eeb-4948-be6c-dbb5bd19da51" containerName="kserve-container"
Mar 18 17:07:51.347714 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.347692 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29"
Mar 18 17:07:51.349650 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.349625 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vwfl5\"/\"kube-root-ca.crt\""
Mar 18 17:07:51.350202 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.350180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vwfl5\"/\"default-dockercfg-cfrxv\""
Mar 18 17:07:51.350325 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.350213 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vwfl5\"/\"openshift-service-ca.crt\""
Mar 18 17:07:51.352501 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.352480 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29"]
Mar 18 17:07:51.431323 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.431278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-podres\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29"
Mar 18 17:07:51.431536 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.431347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-sys\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29"
Mar 18 17:07:51.431536 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.431376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"kube-api-access-t7cxf\" (UniqueName: \"kubernetes.io/projected/4515485f-4104-4779-a2cc-2e7b06fc9368-kube-api-access-t7cxf\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.431536 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.431413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-lib-modules\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.431701 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.431524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-proc\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.532383 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.532346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-podres\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.532562 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.532399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-sys\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " 
pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.532562 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.532470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-sys\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.532562 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.532517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7cxf\" (UniqueName: \"kubernetes.io/projected/4515485f-4104-4779-a2cc-2e7b06fc9368-kube-api-access-t7cxf\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.532731 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.532520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-podres\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.532731 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.532574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-lib-modules\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.532731 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.532655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-lib-modules\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.532731 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.532659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-proc\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.532886 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.532761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4515485f-4104-4779-a2cc-2e7b06fc9368-proc\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.539358 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.539337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7cxf\" (UniqueName: \"kubernetes.io/projected/4515485f-4104-4779-a2cc-2e7b06fc9368-kube-api-access-t7cxf\") pod \"perf-node-gather-daemonset-8np29\" (UID: \"4515485f-4104-4779-a2cc-2e7b06fc9368\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.657972 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.657854 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:51.777783 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:51.777599 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29"] Mar 18 17:07:51.780362 ip-10-0-133-190 kubenswrapper[2576]: W0318 17:07:51.780332 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4515485f_4104_4779_a2cc_2e7b06fc9368.slice/crio-4d412515f02cb6e9b5d57449ef5e1edcb748b24d85418776b5cb4dcefbca6d85 WatchSource:0}: Error finding container 4d412515f02cb6e9b5d57449ef5e1edcb748b24d85418776b5cb4dcefbca6d85: Status 404 returned error can't find the container with id 4d412515f02cb6e9b5d57449ef5e1edcb748b24d85418776b5cb4dcefbca6d85 Mar 18 17:07:52.074631 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:52.074604 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zlctd_c617a8be-fcd4-4bd3-971f-b335484b9beb/dns/0.log" Mar 18 17:07:52.095220 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:52.095192 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zlctd_c617a8be-fcd4-4bd3-971f-b335484b9beb/kube-rbac-proxy/0.log" Mar 18 17:07:52.136467 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:52.136436 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9w7wj_0bc34432-5dcb-459c-a6d8-587f77ae9dcf/dns-node-resolver/0.log" Mar 18 17:07:52.385716 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:52.385623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" event={"ID":"4515485f-4104-4779-a2cc-2e7b06fc9368","Type":"ContainerStarted","Data":"2ff99e0a50b07d930c789a410f20beb3f8b66268fb1c8109dcee53216dca7a7d"} Mar 18 17:07:52.385716 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:52.385662 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" event={"ID":"4515485f-4104-4779-a2cc-2e7b06fc9368","Type":"ContainerStarted","Data":"4d412515f02cb6e9b5d57449ef5e1edcb748b24d85418776b5cb4dcefbca6d85"} Mar 18 17:07:52.385716 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:52.385693 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:52.400762 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:52.400709 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" podStartSLOduration=1.400696895 podStartE2EDuration="1.400696895s" podCreationTimestamp="2026-03-18 17:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:07:52.399999506 +0000 UTC m=+1463.449814933" watchObservedRunningTime="2026-03-18 17:07:52.400696895 +0000 UTC m=+1463.450512325" Mar 18 17:07:52.553687 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:52.553654 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-89d76c4f-f4bs8_962d1b72-c4cb-47c2-af2c-3921403d90f0/registry/0.log" Mar 18 17:07:52.593787 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:52.593757 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sr4nn_ad6e055b-96d2-47b3-bb6b-f2e0165f3470/node-ca/0.log" Mar 18 17:07:53.648498 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:53.648470 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mv4xk_4220311f-b2d3-43c5-87a6-ddf6edd88e2f/serve-healthcheck-canary/0.log" Mar 18 17:07:54.165050 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:54.165024 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-c5nc7_c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e/kube-rbac-proxy/0.log" Mar 18 17:07:54.182656 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:54.182633 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c5nc7_c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e/exporter/0.log" Mar 18 17:07:54.202174 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:54.202151 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c5nc7_c49f1334-5f9a-4e6b-8d3c-dadd6d818f8e/extractor/0.log" Mar 18 17:07:56.062055 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:56.062028 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-x44z4_a141314c-4a0e-4f87-b919-d9f540c7e434/manager/0.log" Mar 18 17:07:56.177401 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:56.177374 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-vxtx7_7bd4203b-0848-472a-94ca-4d8cc6e6b1bf/manager/0.log" Mar 18 17:07:58.398306 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:58.398280 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-8np29" Mar 18 17:07:59.561761 ip-10-0-133-190 kubenswrapper[2576]: E0318 17:07:59.561727 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-model-serving-api:fast\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/opendatahub/odh-model-serving-api:fast: reading manifest fast in quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized; artifact err: get manifest: build image source: reading manifest fast in 
quay.io/opendatahub/odh-model-serving-api: unauthorized: access to the requested resource is not authorized\"" pod="kserve/model-serving-api-9699c8d45-jrcb2" podUID="c1eb4bd1-d0f6-4dd7-a063-638cff8ab0d3" Mar 18 17:07:59.709820 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:59.709788 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-hxcp4_c1a43c8a-8d1f-4c15-804d-994b6386c863/migrator/0.log" Mar 18 17:07:59.730700 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:07:59.730674 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-hxcp4_c1a43c8a-8d1f-4c15-804d-994b6386c863/graceful-termination/0.log" Mar 18 17:08:01.448018 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:01.447991 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7hss_fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1/kube-multus-additional-cni-plugins/0.log" Mar 18 17:08:01.467622 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:01.467590 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7hss_fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1/egress-router-binary-copy/0.log" Mar 18 17:08:01.487376 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:01.487348 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7hss_fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1/cni-plugins/0.log" Mar 18 17:08:01.507147 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:01.507126 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7hss_fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1/bond-cni-plugin/0.log" Mar 18 17:08:01.529373 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:01.529351 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7hss_fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1/routeoverride-cni/0.log" Mar 18 17:08:01.548795 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:01.548769 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7hss_fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1/whereabouts-cni-bincopy/0.log" Mar 18 17:08:01.567337 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:01.567313 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7hss_fc696d94-dd9b-4ec4-8c68-3ef333b3aaa1/whereabouts-cni/0.log" Mar 18 17:08:01.622111 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:01.622043 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wqbkg_832b4122-2cb3-47a3-9cdb-f45d80349575/kube-multus/0.log" Mar 18 17:08:01.676192 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:01.676143 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bd2jt_e3720b07-5dd9-403e-9a66-17abd67f145f/network-metrics-daemon/0.log" Mar 18 17:08:01.697382 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:01.697361 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bd2jt_e3720b07-5dd9-403e-9a66-17abd67f145f/kube-rbac-proxy/0.log" Mar 18 17:08:02.803839 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:02.803806 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-controller/0.log" Mar 18 17:08:02.823235 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:02.823208 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/0.log" Mar 18 17:08:02.830251 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:02.830227 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovn-acl-logging/1.log" Mar 18 17:08:02.848529 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:02.848501 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/kube-rbac-proxy-node/0.log" Mar 18 17:08:02.871277 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:02.871242 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/kube-rbac-proxy-ovn-metrics/0.log" Mar 18 17:08:02.893180 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:02.893155 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/northd/0.log" Mar 18 17:08:02.917455 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:02.917436 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/nbdb/0.log" Mar 18 17:08:02.941047 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:02.941027 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/sbdb/0.log" Mar 18 17:08:03.044559 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:03.044482 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g2jdr_df16c6f7-55c8-4f26-80fc-ae0ba3fa368a/ovnkube-controller/0.log" Mar 18 17:08:04.464956 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:04.464921 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-j676s_ce437450-60be-4502-aa99-03a4af6c8e7c/network-check-target-container/0.log" Mar 18 17:08:05.580524 ip-10-0-133-190 kubenswrapper[2576]: I0318 
17:08:05.580497 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hqszc_fe7890a0-afa4-444d-9ad4-445e8890337a/iptables-alerter/0.log" Mar 18 17:08:06.245487 ip-10-0-133-190 kubenswrapper[2576]: I0318 17:08:06.245456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bt66t_50878b4f-dc7c-45c7-81b8-e6f009741d18/tuned/0.log"